The command being used (for example, "build", "restore")
The ExitCode of the command
For test projects, the test runner being used
The timestamp of invocation
The framework used
Whether runtime IDs are present in the "runtimes" node
The CLI version being used
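Put together, each of those items is just a field in a small structured event. A sketch of what one record along those lines might look like (field names here are illustrative, not the actual .NET CLI schema — see the linked source for that):

```python
from datetime import datetime, timezone

# Hypothetical telemetry event mirroring the fields listed above.
# Field names are illustrative only, not the real .NET CLI schema.
event = {
    "command": "build",                      # the verb invoked
    "exit_code": 0,                          # ExitCode of the command
    "test_runner": None,                     # only set for test projects
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "framework": "netcoreapp1.0",
    "has_runtime_ids": False,                # "runtimes" node present?
    "cli_version": "1.0.0-preview2",
}

print(sorted(event))
```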
Here is the telemetry code itself: https://github.com/dotnet/cli/blob/5a37290f24aba5d35f3f95830...
They also publish all the telemetry data (through 2016 Q3): https://dotnetcli.blob.core.windows.net/usagedata/dotnet-cli...
Welcome to .NET Core!
Learn more about .NET Core @ https://aka.ms/dotnet-docs. Use dotnet --help to see available commands or go to https://aka.ms/dotnet-cli-docs.
The .NET Core tools collect usage data in order to improve your experience.
The data is anonymous and does not include command-line arguments. The data is collected by Microsoft and shared with the community.
You can opt out of telemetry by setting a DOTNET_CLI_TELEMETRY_OPTOUT environment variable to 1 using your favorite shell.
You can read more about .NET Core tools telemetry @ https://aka.ms/dotnet-cli-telemetry.
A command is running to initially populate your local package cache, to improve restore speed and enable offline access. This command will take up to a minute to complete and will only happen once.
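Because the opt-out is an environment variable, a client can check it before any network call is ever made. A minimal sketch of such a check (my own illustration, not the actual CLI implementation):

```python
import os

def telemetry_enabled(environ=os.environ) -> bool:
    """Return False when the user has opted out via the documented
    DOTNET_CLI_TELEMETRY_OPTOUT variable (illustrative logic only)."""
    value = environ.get("DOTNET_CLI_TELEMETRY_OPTOUT", "").strip().lower()
    return value not in ("1", "true", "yes")

# With the variable set to 1, nothing should be sent at all.
print(telemetry_enabled({"DOTNET_CLI_TELEMETRY_OPTOUT": "1"}))  # False
print(telemetry_enabled({}))                                     # True
```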
Hard to believe, but they used to sell products a while ago and had no telemetry.
If you want to see how it's done properly, look at OmniGroup: their apps have toggleable telemetry and it's off by default.
There is a difference between collecting information about how many people are using a product versus whether a particular person is using it.
Collecting diagnostic information about Windows application failures, how many failures occur, etc. has been going on since the Windows 95 era.
Similarly, collecting information about how many people are using dotnet core build/test/publish is comparable to how Google/Mozilla track how many users are running which version of their product and experience issues.
If Microsoft/Google/Mozilla or any other company used that information to identify a specific person, that would be "effectively spying on you". Until that happens, the same functionality exists in almost every product. Just a clickbait article.
Doesn't have to be malicious, doesn't have to be what's legally defined as personal information. The fact that many companies are doing it doesn't make it less inappropriate.
Reputable companies will clearly inform users and ask for their confirmation. Then they respect their choice.
Disreputable companies such as MS or Google take without asking, use dark patterns to trick users, default to always on, reset privacy settings, etc.
I think I'll be happy the day the EU and American consumer protection agencies start looking closer into Google's business.
I'd also applaud even more visible information about what exactly gets collected and sent (the old gds "Read very carefully - this is not the usual yadda yadda" would be a good start).
However, IMO we shouldn't call legitimate telemetry "spyware". I think that is what you call "crying wolf".
If you say no, it won't send anything.
No, the settings do not mysteriously reset themselves.
"Someone submitted a PR to Mozilla to fix this, and the Mozilla devs closed it"
The telemetry I was talking about is exactly the one where you get a bar at the bottom during first launch. Try it; you will see it.
The specifics of a custom deal with Google, and the circling of the wagons (specifically, opinions expressed by multiple Mozilla employees in an official capacity) prior to reversing course, do not strengthen that case.
> If you say no, it won't send anything.
This simply wasn't true; I am glad that the implementation was fixed.
Yeah, well, your first clue (and the author's) should have been when you stopped paying for said products.
And in this specific case, it's really not spying; it reveals pretty much nothing about you and helps them figure out what is used and what fails.
Or are you making some weird accusations against the FSF and the GNU Project?
Again, I am making a huge assumption about correctly selected telemetry data here, but opt-in mechanisms won't get even 10% of the data they currently do.
Sure, ask up front explicitly, but don't invoke the first capture in passing before consent has been given. That's a shitty tactic.
The other things being collected are:
* Geographical location
* Operating system and version
That's not all that matters. IMO the real decision is: do you /trust/ MS? Do you trust that they anonymize collected data and that they won't secretly change what is collected? Do you trust a future MS with that information?
> I'm actually OK with this to be honest
That's perfectly fine if you trust them. Many people don't. Personally I wouldn't trust any dev tool that uploads my usage.
You don't need to trust them. The telemetry code is open source AND they release the aggregate data it collects for anyone to use/inspect.
Why do you have to trust MS? You can read the source code to check for yourself whether sensitive information is sent. You don't have to take Microsoft's word for it.
Bear with me. This seems like the wrong question, but not for the reason you might expect. Rather, I think that it might be wrong because, even if Microsoft acts in completely good faith, it is damn near impossible to anonymise collected data properly [obligatory citation of the 'anonymised' AOL search data]. It doesn't matter whether I trust someone to do something if they (probably) can't do it.
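The AOL incident illustrates the mechanism: fields that are individually harmless act as quasi-identifiers, and their combination can single someone out. A toy sketch with entirely made-up records:

```python
# Toy illustration of re-identification: no single field identifies a
# person, but the combination of quasi-identifiers can. All data invented.
records = [
    {"id": "a1", "location": "Oslo",   "os": "Windows 10", "cli": "1.0.1"},
    {"id": "b2", "location": "Oslo",   "os": "macOS",      "cli": "1.0.1"},
    {"id": "c3", "location": "Berlin", "os": "Windows 10", "cli": "1.0.1"},
]

# An attacker who knows only that the target is a macOS user in Oslo...
matches = [r for r in records
           if r["location"] == "Oslo" and r["os"] == "macOS"]

# ...is left with exactly one candidate, despite the "anonymous" ids.
print(len(matches))  # 1
```

Scale the record count up and the principle holds: the more fields you collect, the smaller the anonymity set each user hides in.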
> * Geographical location†
I feel like that's one of the pieces of information I'd expect a new opt-in or notification to appear for at the very least. Did that happen?
Just look at the automotive industry in Germany. If you give them trust, they will probably do shady stuff, no matter how good their initial behavior was.
Never trust a company.
It's not just independent devs that are using .net. And the name of the company appears often in the assembly.
MS could update your OS to do anything tomorrow, Canonical could hide literal malware in any number of packages for Ubuntu tonight, and Intel could write a backdoor into your machine in its next microcode update.
And OSS doesn't fully prevent this either. GCC could add some kind of nefarious exploit in the next version of its compiler (knowingly or otherwise). Just take a look at the Underhanded C Contest to see how scarily easy it is to hide exploits in plain sight!
I can't even fathom the amount of work it would be to personally review every line of code that goes into your machine from the microcode up to the newest NPM module (even if it were all open and it was possible to do). At some point you need to trust someone else.
You're right - there isn't enough time to audit everything, so we have to rely on trust. "Relying on trust" means instead of reviewing code, you have to review trustworthiness.
1. It's setting a bad precedent for data collection by default. Name one other tool of the same class that sends telemetry data home by default.
2. It's much harder to ensure that the tooling is compliant with data protection policies within an organisation if the tooling by default sends telemetry. We now have to assume it's going to send stuff by default and configure all build infrastructure, every developer workstation and every piece of the toolchain independently. This is particularly of concern in the finance sector. It also costs us time and money.
3. There are no test cases covering the telemetry functionality at all. Check the code. What happens if it starts reporting command lines due to a trivial defect?
4. There is a crudely defined document which describes what the telemetry does now, but not what it will do in the future. What happens is: a PR appears, gets merged, and gets pushed out in a new version. To find out what changed, you have to read every merge and every PR for each release.
This is a loaded gun waiting for any security-conscious team to shoot themselves in the face with. Really, this will gate the product into the bin at the first technical review stage for a lot of companies. There is no appetite for being milked.
I'd also like to add the absolute lack of communication on this front from MSFT. People have asked directly via PRs to turn this off because they do not want it, and they have been ignored for over a year. The usual response from MSFT is never to answer this question directly but instead to outline what the telemetry does, expecting that to put the question to rest. If there's anything I've learned over the years, it's that you can't trust anyone who won't answer a direct question.
No, this is not that. The "tu quoque" logical fallacy follows this pattern (from Wikipedia):
1. Person A makes claim X.
2. Person B asserts that A's actions or past claims are inconsistent with the truth of claim X.
3. Therefore X is false.
1. It was announced in the open in June 2016 that .NET Core includes telemetry: https://blogs.msdn.microsoft.com/dotnet/2016/06/27/announcin...
2. If you use something you could at least follow changes between major releases, no?
When did engineers stop being responsible people who read before using things? :-O
So, the latest would be:
What happens if you accidentally paste an AWS secret key or similar in the middle of a command line argument? Will that too appear in public csv files a year later?
What happens if you accidentally paste an AWS secret key or similar in the middle of a command verb? Will that too appear in public csv files a year later?
> Only known arguments and options will be collected (not arbitrary strings).
We don't want your AWS secret key in this data as much as you do. We have put systematic mitigations in place to ensure that this doesn't happen.
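The "known arguments and options only" mitigation amounts to an allowlist: anything outside a fixed vocabulary is dropped before it can reach a payload. A sketch of the idea (my own illustration with made-up verb/option sets, not the actual .NET CLI code):

```python
# Allowlist filtering: only pre-approved verbs and options survive;
# arbitrary strings (paths, pasted secrets) are discarded. Illustrative only.
KNOWN_VERBS = {"build", "restore", "test", "publish", "run"}
KNOWN_OPTIONS = {"--configuration", "--framework", "--output", "--verbose"}

def sanitize(argv):
    """Keep only tokens from the known vocabulary, in order."""
    return [t for t in argv if t in KNOWN_VERBS or t in KNOWN_OPTIONS]

# A pasted secret never makes it into the telemetry payload.
# (The key below is the well-known fake example key from AWS docs.)
print(sanitize(["build", "--configuration", "AKIAIOSFODNN7EXAMPLE"]))
```

The design choice here is that the filter is a positive list, so a trivial defect can only drop data, never leak an arbitrary string.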
We are turning on telemetry in the next release for our open source tool. https://github.com/getgauge/gauge
We are a small team with limited resources.
In our tool, it's easy to turn telemetry off, inspect what data is sent and the data collected is public.
The data "really" helps to make the tool better, and an opt-in skews the data.
We've published a blog post https://blog.getgauge.io/why-we-collect-data-b19df366b677 and will put it in the release notes and the download section.
What else can be done so that users don't blow up?
If you get an uptake of, say, 5-10%, and that's worth it, then problem solved. If it's not, then don't bother adding telemetry in the first place.
But before you do this, you have to ask the question: how did the software industry get by before the sudden rise of telemetry? It engaged the customer.
I think in a lot of cases it is used as a substitute for engaging the customer.
If adding telemetry is faster and easier than engaging with the customer, then you'll see projects that add telemetry that wouldn't otherwise have the bandwidth to engage with the customer.
In general, I think the best way to go is to ask in the installer or initial setup whether you want to send telemetry, and have a sane default depending on whether you gather potentially personal information (location? personal; commands run (without args)? not personal).
Example Prompt: Send telemetry (commands used, version) (y/n)[y]:
An additional flag for non-interactive installs can solve the problem, but that's a broken setup experience: someone has to look up the documentation after a failed install.
Turning it off by default in case of a CI/CD setup means losing most of the data.
I'd recommend a required installer flag forcing the user to make a decision, but I'm a user who generally leaves telemetry on.
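Combining the prompt, the default, and a flag for non-interactive installs might look like this (hypothetical installer logic; the flag names and the off-by-default choice for non-interactive runs are my assumptions, not anyone's actual installer):

```python
import sys

def ask_telemetry(args, input_fn=input):
    """Hypothetical installer logic: honor an explicit flag when given,
    avoid hanging in non-interactive runs, otherwise prompt, default 'y'."""
    if "--telemetry=on" in args:          # explicit opt-in (e.g. CI/CD)
        return True
    if "--telemetry=off" in args:         # explicit opt-out
        return False
    if not sys.stdin.isatty():            # non-interactive, no flag given:
        return False                      # collect nothing rather than guess
    answer = input_fn("Send telemetry (commands used, version) (y/n) [y]: ")
    return answer.strip().lower() in ("", "y", "yes")

print(ask_telemetry(["--telemetry=off"]))  # False
```

Making the flag required in non-interactive mode, as suggested above, would just mean raising an error in the `isatty()` branch instead of returning False.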
Here's what we are doing in the next release (out this week):
* Ask if the user wants to opt out in the graphical installers.
* Print a message after non-graphical installs about data collection and link to documentation on how to turn it off.
OTOH nobody gets around a firewall which blocks all outgoing connections ;)
This tool compiles code. Why does it need to make a network call at all? That's going to slow down your builds for the sake of phoning home to Microsoft, a company we don't exactly trust to be a good steward of our information.
I think they should ask, like Yeoman does, but I don't think they deserve this much shit for such a small thing.
So the next version of Bash should have telemetry?
The difference is expectation. I expect websites to run things I don't control. I expect a local application to behave in a certain way.
The point is, I understand why people don't want telemetry. I don't. I think they should ask before they do it; a lot of people are probably willing to share the data. BUT I also understand why they are doing it, and I think they've done it in a good manner still.
You should also think about your expectations, one shouldn't have to expect that every site is trying to track you.
> and basically any site today does more intrusive telemetry
has absolutely nothing to do with a local application.
They are just ignoring it, letting the issue die silently.
"I don’t want your tools spying on you either." How virtuous. Some people don't care, though; some people actually prefer it.
Then it won't be a problem to disclose exactly what is proposed, get those people's informed consent, and leave everyone else alone, will it?
Do they want to use this data to create a good tool?
Or do they want to use the data to create a tool that appeals to the average user?
I still don't think this is a non-problem. When you are using many different tools that are updating constantly, it is easy to not notice one adding telemetry. And even if you disable it, it very well may be silently re-enabled in the future.
In fact if you use a product why would you want to conserve your 'precious body fluids' (telemetry) instead of helping improve the product? Beats me.
It's in Microsoft's DNA to build stuff that captures and watches and monitors and logs.
The fact that they've started to be more open won't change the fundamental company attitude and approach to doing things.
Microsoft will simply be bringing more "Microsoftiness" to the open source world. Get used to it; there's more coming, because that's the way they build software.
I would suggest that it is time to rethink some of those outdated assumptions that tools won't spy on you. Microsoft has arrived at the open source party, so open source isn't the same any more. Just accept that the world has changed and that it's entirely possible your open source is logging and watching.