Yea, like the Microsoft Palm PC isn't a rip-off of the Palm Pilot :]
For reference, IE accounted for ~96% of browser use when that whole antitrust case was happening.
In this case, ~90% of web searches go through Google. So they are using the fact that they are the number one web search engine to convince users to switch to Google Chrome.
That would be Microsoft Chrome :]
And that's why it's paramount not to use any Chromium browser while we still have the choice. Luckily, you don't have to give up on anything.
Except, you know, Microsoft products. Like Skype.
Regardless, sure, use a chromium browser for that task if you absolutely must. It is good separation anyway.
> diagnostic data collection (telemetry) is not enabled for private builds
> this data collection is covered by Windows 10 privacy. You can find the Windows 10 privacy statement and details of controlling the diagnostic and feedback settings here.
So if you build from source, you can disable it, and if you don't build from source but install it from the store, then telemetry is controlled by the central privacy settings in Windows 10.
Presumably this would be a problem only if you specifically don't want MS to have telemetry from winget, but you also specifically want them to have telemetry on the rest of your OS, which would be... weird.
For Windows Home and Professional users, this is not the case. Disabling telemetry is not possible because Microsoft have decided that there is a minimum "required" amount of telemetry every OS installation must send in order to function.
If the telemetry description stated that some Windows users are able to opt out, they'd be correct.
It's just another example of Microsoft showing they couldn't give a rat's ass about their customers' wishes and that you'll just have to deal with them tracking everything you do.
And who knows? Maybe I do want to send Microsoft reports about kernel bluescreens so that Windows can get more stable, but don't wish to upload a report every time I install or uninstall software. Why would it be strange to want to share some telemetry but not all of it?
Yes, but AFAIK winget telemetry fits entirely under the optional category. No winget metric falls under the "required" category that you cannot disable.
In other words, you can completely disable winget's telemetry, which the title of this article says you cannot.
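For anyone wanting to verify: winget exposes this switch in its documented settings file (opened with `winget settings`). Assuming the current schema, the relevant fragment looks like:

```json
{
    "telemetry": {
        "disable": true
    }
}
```

Worth double-checking the key name against the schema shipped with your winget version, since the settings format has evolved across releases.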
Or if you build the tool yourself.
Here's one of the elements that is required (and so it cannot be disabled):
Information about your device, its settings and capabilities, including applications and drivers installed on your device
So while I agree with you that the required telemetry is obtrusive, it's not new and it's not related to winget.
This project collects usage data and sends it to Microsoft to help improve our products and services
If they open the whole business to transparency, they become victims of commercial espionage and expose their users' data to invasion.
If that is such a big task maybe not doing it in the first place is the reasonable thing to do.
MS adheres to GDPR export/delete requirements. Failure to adhere risks litigation, so it's not something that MS is just going to flout willy-nilly.
It's frustrating reading a lot of comments that are of the form "Why doesn't MS just do XYZ?" from individuals who don't seem to know that MS is already doing that.
[disclaimer, MS employee]
I guess this is just Cunningham's law in action.
> Why doesn't MS just stop forcibly collecting data from my machine?
At the minimum data-collection level for client SKUs (set via user settings), MS asserts that only data needed to keep the machine running and secure is collected (i.e., updates). They really do try hard to honor this assertion, although I understand I'm not going to be able to entirely eliminate skepticism.
You can argue that you should be allowed to disable telemetry for updates, but I think that's a different conversation. MS didn't want Windows' reputation to suffer due to undeployed fixes (this was a big problem prior to Win8), and they took a hard line on updates.
Enterprise SKUs have a zero exhaust option (no telemetry), but alas, it's not free.
I think this phrase is going to be a contentious point anyway. Microsoft doesn't need any telemetry to "keep the machine running" - as trivially evidenced by me pulling out the Ethernet plug, disconnecting the wireless adapter, and booting up Windows 10. The OS will run just fine. As for keeping it secure, it depends on how much you stretch the word "secure". I can't see any reason why telemetry would be needed to deliver updates (Windows could just download the list of available updates and request the ones it needs). You could argue telemetry is needed for malware detection and mitigation, but that can be stretched to justify everything up to and including uploading snapshots of your hard drive.
I'm guessing "needed to keep the machine secure" is how we've arrived at the rather absurd situation in which Windows Defender is automatically uploading executables it finds on the machine, to have them tested on Microsoft servers - which has been demonstrated to be a neat vector for exfiltrating data from secure corporate networks: the malware can just encode the data it collected into an executable and write it out to the file system, at which point it gets picked up by Defender and uploaded to Microsoft (which corporate firewall will most likely allow), where upon execution it connects to the attacker's server and delivers collected data.
This is not entirely correct: for as long as MS has offered updates via the internet, the update service has required some information from the user's machine. If we went back to 2003 with the same legal definitions and awareness that we have today, we could have called this information 'telemetry'. The legal definitions and public awareness of data gathering have changed a lot since then, but the use of 'telemetry' (for lack of a better word) hasn't.
You'll have to forgive my ignorance about Linux because I haven't used it in a long time: I just fundamentally don't see how any update manager (be it chocolatey, nuget, windows, or linux etc) can efficiently update a client without knowing what's already on the client. If any installation information is sent from the client to an external entity, it may be considered 'telemetry' and subject to certain laws. Doesn't sudo send such information? (again, forgive my ignorance on the Linux implementation).
Pretty simple. Instead of sending information about what's on the machine to the update server, the client would download the list of available updates from the server, use that list to determine what updates it needs, and request those specific updates from the server.
That's e.g. how apt - the Debian family package manager - works. You `apt update` to update the local cache that lists available packages and their versions; then, you `apt upgrade` to download and install any or all packages that need updating.
You might argue that this is not "efficiently", but it works great and in my personal experience a lot faster, more reliably and more predictably than Windows Update.
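The pull model described above can be sketched with nothing more than coreutils. This toy (invented file names, not real apt internals) shows the client computing the needed updates locally from a downloaded index:

```shell
# Toy illustration of a pull-based updater: the client downloads a flat
# index of package=version pairs, diffs it against its own installed
# list locally, and only then requests the packages it actually needs.
# Nothing about the machine is uploaded anywhere.
printf 'curl=8.1\nvim=2.0\nzsh=5.9\n' > available.txt   # fetched index
printf 'curl=8.1\nvim=1.9\n'          > installed.txt   # local state
sort -o available.txt available.txt
sort -o installed.txt installed.txt
# Lines unique to the index are new or upgraded packages to fetch.
comm -13 installed.txt available.txt
```

The version-comparison logic in a real package manager is far richer, but the direction of the data flow is the point: the inventory stays on the client.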
(As for sudo: No, unlike e.g. the .NET Core CLI and winget, system utilities don't normally contain extra unrelated functionality to silently collect user data and send it off somewhere. Perhaps one reason Microsoft don't hear more complaints about it from developers is that nobody imagines that you would do such a thing. It's normally not done.)
So sudo, depending on system configuration, does log interactions, but it does so locally. The rationale is that Linux's heritage is one of multi-user systems, so you want the owner/sysadmin of the system to be able to audit the use of superuser privileges by regular users, in particular when they break something. On a Linux system installed on a desktop, or a server you admin yourself, the user is the only person who can access these logs anyway, so it's no different from any other system log.
That these logs would be automatically sent to third parties is unthinkable - in the sense that nobody would even think an operating system would dare send sensitive information out like that.
Wait, what? What broadcasts from my computer after I disable telemetry?
Required telemetry (that cannot be disabled):
* Basic error information to help determine whether problems your device is experiencing can be addressed by the update process.
* Information about your device, its settings and capabilities, including applications and drivers installed on your device, to ascertain whether your device is ready for and compatible with the next operating system or app release and ready for update.
* Logging information from the update process itself to understand how well your device’s updates are proceeding through the stages of downloading, pre-installation, post-installation, post-reboot, and setup.
* Data about the performance of updates on all Windows devices to assess the success of an update’s deployment and to learn device characteristics (e.g., hardware, peripherals, settings, and applications) that are associated with the success or failure of an update.
* Data about which devices have had upgrade failures and why to determine whether to offer the same upgrade again.
The most egregious of that list is, imo, this:
* Information about your device, its settings and capabilities, including applications and drivers installed on your device
Minor correction: if you're using Enterprise/Education, you can set the telemetry level to "security", which sends less information than what you've listed. https://docs.microsoft.com/en-us/windows/privacy/configure-w...
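For completeness, that level corresponds to the `AllowTelemetry` policy value, which can be set via Group Policy or the registry. A minimal .reg sketch (value 0 = "security"; honored only on Enterprise/Education/Server SKUs, while Home/Pro treat it as Basic):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
; 0 = Security (Enterprise/Education only), 1 = Basic, 2 = Enhanced, 3 = Full
"AllowTelemetry"=dword:00000000
```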
Also, don't use blockquotes for lists as they don't wrap, making reading on mobile a pain.
How is it possible to know what updates to install without checking this information?
1. The software-update process includes metadata that indicates which local machine conditions must be satisfied to warrant installing a given update.
2. The local machine checks that metadata against its own settings, capabilities, applications, and drivers to select and apply the relevant updates.
The collection of massive amounts of data is not necessary to provide a collection of available updates and leave it up to a local machine to determine which of those updates it needs to apply.
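That local applicability check is trivial to sketch. In this toy (the metadata format and every name are invented for illustration), the update metadata ships condition strings and the client checks them against its own inventory without uploading it:

```shell
# Invented metadata format: each update names a condition that must be
# present on the machine for the update to apply.
cat > update.meta <<'EOF'
update=KB123 needs=driver:acme-gpu
update=KB456 needs=app:widgetpro
EOF
# The machine's inventory stays local; it is only ever read, never sent.
printf 'driver:acme-gpu\napp:notepad\n' > local-inventory.txt
while read -r line; do
  need=${line#*needs=}           # condition declared by the update
  id=${line%% *}                 # e.g. "update=KB123"
  grep -qx "$need" local-inventory.txt && echo "apply ${id#update=}"
done < update.meta | tee applicable.txt
```

Only KB123 matches here, so only that update would be requested; the server learns which updates were downloaded, not what else is on the machine.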
Even if winget allowed for disabling telemetry entirely, our OS is still collecting telemetry too, so it seems the point is pretty much moot, imo.
The first position seems a bit odd for something that is open source (so presumably you can verify what's being sent). I mean it might be bad to send "I installed product X" or "I used the command X" to a remote server, but on the other hand if I really feel this is a problem would I ever even be using the closed source binaries that the package manager installs, without worrying more what they might do, than what happened when the package manager ran?
Sometimes I get the feeling that the telemetry thing just became an expression of annoyance with something else entirely, or just the current state of affairs. It's like one of those cultural wars where every battle is so symbolic that everyone forgot what the real issue was ("Why do we worry so much about who uses which bathroom again, dad?").
As for why this matters, you need only look at Hong Kong for that answer. And even here in the US, many large companies are kowtowing to the Chinese government and changing their products in the US to toe those lines.
I think that it's incorrect to believe that telemetry allows developers to build "better" software. Web pages have had intrusive telemetry for over a decade now, and they still broadly suck. We're able to navigate the world of hamburger menus, text with poor contrast, and scrolljacking because we're adaptable humans, not because these kinds of changes actually improve the websites.
The problem is that telemetry can only tell you the "what" and "how", but not the "why".
> If you trust the developer
It's not just the developers who gain access to that data. It's everybody at the company. It's 3rd parties who contract with that company. It's everybody who hacks into the data set at the company. And it's every governmental agency from (almost) any country with an overly broad warrant who has access to that data.
See my upstream comments about China and Hong Kong
> No one is making you use that software.
Just as nobody is making an effort to make it simple (or even possible) to make an informed choice about whether I should use that software. Homebrew is a great example. I know that it's sending telemetry to Google only because someone raised a ruckus and used social pressure to make the homebrew devs add in a reasonable amount of consent into the install and update process.
Thankfully, laws are catching up with technology, and I should soon be able to rely upon simply saying "data collection not directly tied to software functionality legally requires my explicit consent, and that consent is not required to run the software".
1. Did this error condition that should not happen happen?
2. How many users did it happen for? (The "who" is not available, for legal reasons.)
If you search for tracelogging in the code you'll find they're associated with debugging and errors.
I don't know if you understand how difficult it is to debug programs that fail on someone else's machine. Well that's what telemetry is for.
1. Make telemetry opt-in and ask the user to share data. This is what Apple does
2. Collect telemetry and give the user an option to share that data after a crash or bug. Many companies do this.
3. Make telemetry opt-out but be forthright with what information is sent and why.
4. Don't allow opting out of telemetry but be open and upfront with what is being collected and why.
5. Hide the fact that telemetry is being sent, hide what is being sent, and don't allow the user to opt out.
You should check - apple software still contacts the mothership constantly.
You're just illustrating a single possible use among the universe of actual real world uses of telemetry, such as gathering your personal info to build up your profile to use against you as the company sees fit.
As others have stated throughout this discussion, it's standard practice for over a decade to use systems that don't collect telemetry to keep systems updated and safe. It boggles the mind how some people try to claim that the only way to remain safe with Microsoft products is to be perpetually spied upon.
Maybe there's a correlation between invasive telemetry and not caring about your users.
I'd even go so far as to say "Common Sense".
Both, I imagine. Worried about the contents because the trend of companies pulling in as much data as they can tends to not end well for consumers, and annoyed as it's an invasion of privacy that there is tracking for the many things we do on our personal devices.
Collecting data creates a risk that the data will be intercepted or misused at a later point, and there are often legal requirements regarding data collection and processing too.
It's also the principle.
Lastly, it's trust. I tolerate some telemetry from Apple and sometimes opt into it with things like Debian. I don't trust Microsoft not to be customer-hostile.
I understand why telemetry is valuable to developers and I can accept when it's active by default. But I _insist_ on being able to turn it off.
Hopefully Shutup10 will be updated to disable this telemetry too.
The bathroom comment is off topic flamebait.
Unfortunately, it's often also combined with "I have the right to have my opinion counted", which leads to some very ironic moments where people opt out of telemetry and then complain that their uncountable needs aren't counted.
This also applies to newer releases of PowerShell, aka PS Core. I haven't tried either, but I guarantee you telemetry in both applications is not opt-in but opt-out, using some obscure method, if that is even possible.
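For what it's worth, both tools do document an environment-variable opt-out (no obscure method required, though it is indeed opt-out rather than opt-in). A minimal sketch for a login profile:

```shell
# Documented opt-out switches; set them before the tools start.
# (Opt-out, not opt-in: telemetry is on until these are exported.)
export POWERSHELL_TELEMETRY_OPTOUT=1   # PowerShell 7+ (PS Core)
export DOTNET_CLI_TELEMETRY_OPTOUT=1   # .NET Core SDK / dotnet CLI
# Quick sanity check that both are visible to child processes:
env | grep TELEMETRY_OPTOUT
```

These only cover PowerShell and the dotnet CLI, of course, not the OS-level diagnostic data discussed elsewhere in this thread.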
In any case, the claim that telemetry is necessary to improve anything related to customer experience is ridiculous. Not only is general data collection unnecessary; it would be more efficient to run some experiments, even if only some opt-in A/B tests. Surveillance like the above is encroaching and can easily be abused. The data collected are usually fine-grained enough to allow for some nice fingerprinting of individual users. The potential for abuse is high.
 https://news.ycombinator.com/item?id=19322398, https://news.ycombinator.com/item?id=19324538
The file you've identified produces a local, opt-in event stream that does not leave your machine unless you literally e-mail it to me. It's just got that unfortunate word in the filename that means we're bad guys.
EDIT, upon closer inspection: when this is built as part of the Windows product (which consumes source from this repository) those values may end up in an event stream. In the interest of full disclosure, those events are:
1. Part of the console host (conhost.exe) and covered by the Windows global data collection settings
2. Pertaining to (incomplete, but it's too early in the morning for me to do a full review of this code):
2.a. The number of times each low-level console API was used
2.b. How the legacy Find dialog is being used (long strings, short strings, search direction, number of times)
2.c. Specific settings like font size, how many colors are configured, how big the window and buffer are
When the console host (just C:\windows\system32\conhost.exe, not the new Terminal) exits it emits the following information for processes that had connected to it:
* How many ANSI/VT sequences they used
* How many of the above we understood
* How many of them we did not understand
* The executable stem name (ConsoleApplication1.exe, wsl.exe, cmd.exe)
* How many times we saw that executable
~1-5% of those entries make it into a data pipeline that I believe we stopped looking years ago. These pipelines are usually(?) turned off by the OS, so it's possible that these were rendered inert. Still, though, and because the executable stem name might be a little more exposure than anyone's comfortable with, I've filed https://github.com/microsoft/terminal/issues/6103 to yoink it.
(It's been a long time and I still don't know how to format things properly on Hacker News :))
I am not sure what you are trying to do with your clarification here, but I feel even worse about Terminal and your employer.
> ~1-5% of those entries make it into a data pipeline that I believe we stopped looking years ago.
Can you elaborate? What does that mean? 1-5% of what, exactly? Of all records collected? Of the records on my machine? I thought those were localized traces; do they end up on MS servers or not? What criteria are used to reduce the data? You believe that "we stopped looking years ago". I'm not sure what confidence interval "believe" accounts for, but the error term is a bit high for my taste. And as for "we stopped looking years ago": I feel offended. You must think that I am very naive (that extends to all readers here, but I will only speak for myself).
The code is clearly labeled telemetry, yet you claim that this is solely about local traces. If I am to believe that, it makes me distrust your software even more, because then it must be of very poor quality. How do you fix bugs quickly, given that your engineers can't differentiate "local traces" from "telemetry"? The latter literally means "to measure from afar". The fact that you opened a GH issue about this makes me want to believe you; the fact that it's locked doesn't; but hey, that's your dev process and community work -- I won't judge that.
As for your next argument, "it's the OS's fault": I don't care whether the OS vacuums up all my data and sends it to your employer, or whether you personally break into my house to exfiltrate my hard drive: I don't want you to obtain my data in any way. This makes it even worse: the application you're responsible for appears to passively create some sort of profilable data, which the OS exfiltrates. That does not remove your responsibility.
Also: Please consider the bigger issue here. I really want Terminal to be a great piece of software. I would consider myself a Windows fan, if it wasn't for the ongoing disrespect of my privacy; Windows is unmatched in terms of stability and consistency and a proper terminal has been missing.
Your employer claims to gather data to improve the software I use, in my interest; despite the fact that I (and many others) keep telling them that we are REALLY interested in retaining the right to privacy, individualism, and secrets.
This is about trust. You keep breaking it. I don't trust you, and I don't trust your employer. When I tell you and your peers so, your answers are evasive (GH issues is not the right place), hazy (oh, THAT is done in another component), bureaucratic (look at all these legal statements), irresponsible (it's the OS), or otherwise elusive (that thing clearly labeled Telemetry doesn't do telemetry, because architecture). You don't understand the problem and you don't really care. I believe you stopped looking at that "pipeline" years ago.
It seemed reasonable considering VSCode is also a Microsoft product with an explicit telemetry option that you can opt out of.
Within 15 minutes the issue was closed and the idea of adding a telemetry option was dismissed by a contributor.
Kind of scary to use something so integral to your day to day as a developer is having that much data being sent out to Microsoft. It's partly why I stick with wsltty (which is equally as fast and has no telemetry).
It turns out that as long as you have Basic telemetry settings in Windows then the Terminal app doesn't send anything out to Microsoft by default.
This comment goes into more details on what is exactly collected and sent to Microsoft if you use "Enhanced" telemetry (which you don't need to use): https://github.com/microsoft/terminal/issues/5331#issuecomme...
Is that sending Microsoft all of my bash/zsh aliases? And what about
If it works how most other terminals I have used work, that is going to send the name of the program I am running, or the host I am connected to, to Microsoft. That is pretty invasive if you ask me.
EDIT: I put together a list of what happens in this file in a sibling comment.
It has the airs of an internal mandate. I can't help but be deeply suspicious of this behaviour.
I think this post here today serves only as a rallying cry for "no telemetry" extremists and contributes nothing interesting or curious or relevant to HN that hasn't been covered in hundreds of framing-implied or framing-explicit "telemetry is bad" posts prior.
Unlike people acting on mandate given from above.
Name-calling the position you don't agree with does not help calm and rational discussion.
I can count on a single hand the number of times a thread has gotten hopelessly out of hand. Most people have reasons for their positions, for or against. That they may not necessarily agree with your own, or be politely phrased enough for your personal tastes doesn't detract from the message.
Software companies have treated the user's machine as their playground for years with little or no resistance, because most of the technically savvy were A) employed by them, B) kids or niche enough hobbyists to be safely ignored, or C) cared not a lick as long as things worked.
I've been trying to warn places for years that this free ride will end as soon as larger swathes of the population are both socially and technically savvy, and large groups of people are in a position to put up substantial resistance in terms of which implementation behaviors they are willing to support.
I'm a hardliner against hiding anything* from the user. It's creepy, and inspires nil trust. Companies have only themselves to blame when they start taking advantage of users' ignorance. I spend half my time making the various degrees of learned helplessness foisted on people by tech companies a thing of the past for the users I support. It takes time and persistence, but it can be done. Not a one of them appreciates the wool having been pulled over their eyes.
Oh yeah sure, Microsoft is the victim, cyberbullied by their own users. It's a classic case of innocent naive corporations getting senselessly dogpiled by mean common people, clearly that's the way this power dynamic is arranged. Yeah, fucking right.
The whole dotnet core telemetry ticket was a complete shit show which pretty much shows that the developer community are merely subjects.
MS Devs who do venture to discuss it are aware of the complexities and tread very cautiously: any slight misphrase has bad outcomes. Smarter participants in the conversation will merely point you to Microsoft's public facing privacy statement, which is such broad legalese that it's not entirely helpful to the analytically minded individual that wants to learn more.
That said, I've worked with diagnostic data for several years now and I've never seen anything to justify the hatred that MS gets over this. I've been on teams that have spent a lot of effort ensuring that any data is gathered, stored, and used in a respectful and legal manner. I've seen mistakes, but they are disclosed and remedied as part of the business process.
[disclaimer, MS employee, my employer wants me to state that this is my opinion and not Microsoft's]
This solution has been successfully used before, even by Microsoft themselves!
Note that you can still collect crash dumps and usage data (at the risk of reintroducing some of the problems solved by not collecting other people's data) if you just ask nicely (and clearly) and the user decides to allow it. Office and .NET used to do this.
Most people will just accept the defaults, so why piss the rest off?
The insider build requires that you enable full telemetry which includes sending your visited websites to MS. I need WSL2 so I’m just avoiding doing anything private on my personal computer for now.
I understand why the data is useful to them but I don’t think they understand or care why this is an important issue to others
This is madness. It's hard to consider a machine your "personal computer" if you're afraid of doing anything personal on it.
"Personal computer" means one person per computer, vs a mainframe. It's not a privacy statement.
They could just ask for feedback, which is what I naïvely considered the "insider" build for—obviously Microsoft is just spying on you for some reason.
Again, if you don’t want telemetry, don’t use the build Microsoft uses specifically to collect telemetry.
Don’t complain that the build MS uses only for telemetry requires you to provide telemetry.
Not my problem. Sounds like a problem for a company with a QA team.
The Insider build is very clearly targeting people who are OK with potential bugs and instability, and who are OK providing telemetry to Microsoft so they can improve stability.
You are not one of those people, and thus shouldn't use it.
For example: https://discussions.apple.com/thread/251380226 Okay, you can’t send email? Well, did you delete the built-in email app? Do you have your email account set up?
Telemetry aims to fix that by providing context to what was happening when the problem happened. Whether it actually helps software development, I can’t answer.
The point was that Windows 10 2004 is RTM but you have to be a MSDN subscriber to download it, or find some shady Russian website to get it from.
Actually you need Linux, not WSL2. That's the way Microsoft is slowly, sneakily, furtively planting the idea that their Linux is the real thing, which it absolutely is not.
This approach is much much much worse than Ballmer yelling that Linux is cancer, because there is no evident hate, no apparent will to destroy the competition which would raise warnings in the community. It is just Microsoft becoming smarter, then absorbing the technology up to the point it will become one of their products -and associated services-. It's Microsoft PR 2.0 and sadly... yes, it will succeed.
WSL1, on the other hand, was just a proxy masquerading as a Linux kernel, so I would understand confusion between WSL1 and WSL2 over which is real Linux and which isn't.
Another way you can tell it's not actually Linux: if something is acting up, where do I inspect the source code? Where can I send a PR if I come up with a fix?
It's smart that MS bought itself a Linux foundation platinum membership; that probably goes a long way in preempting trademark claims of the 'Linux' name.
Op was saying that he/she could not run WSL2 without Microsoft telemetry.
So, yes WSL2 includes Linux plus Microsoft telemetry. Parent was saying that Op actually wants Linux, since Op wants Linux functionality, but not Microsoft telemetry.
Not just that. Privacy, trustworthiness, and security aside, imagine Microsoft porting (or simply making WSL-aware) some of their software, libraries, system-internals hooks, anything Linux users have wanted for ages (gaming libraries, etc.) to WSL and not Linux: that is, they require WSL but won't run on Linux, or will run "better" on WSL, or will possibly run on Linux but require some closed or prohibitively licensed code to (properly) run.
Those surely are bleak scenarios, but corporations are there to make profits, and forcing Linux users to require a Microsoft branded layer of software is without any doubt a way to keep them hooked to Windows, which for some will turn into buying products and services from Microsoft, and for others into finding less and less software that will run on plain Linux. I wouldn't be surprised at all if say in two years more books will be published about WSL than about Linux. That would hurt as well since it would mean less courses, less schools adopting Linux, in plain words less users.
What you just said equates to "A Motorcycle with a side-car attached can't be called a motorcycle because to call it a motorcycle implies all motorcycles have side-cars", which makes absolutely no sense.
WSL is Linux, but without X Window or gnome. Very convenient for command line stuff because the native Windows Command line is awkward and unfamiliar to use.
Some things work better in Linux, some things work better in Windows. To pretend otherwise is being a fanatic, in my opinion.
the remaining one is used for Win32 app dev/debugging, however it's on a separate VLAN, with a separate external IP address
I think it's really quite sad that you have to adopt military style separate PCs&networks if you don't want your data harvested
They make you send full diagnostic data because WSL2 is only in the Windows Preview program.
Who really owns your computer then?
Is it you or is it Microsoft?
I wish most applications offered 4 boxes:
1) Don't send telemetry
2) Send data needed to catch bad rollouts (think SRE style status code and latency metrics).
3) Send anonymized data to help improve the product.
4) I want to be a beta tester/insider, you can capture my logs.
Without telemetry, it is impossible to get proper usage feedback from your programs without being swayed by the vocal minority of the community.
We always find posts online on how crappy software is, but how can software improve if the majority of people actually using the software don't give feedback at all?
I’m going to reply strictly to how you’re framing your response.
Most people will not read these documents, so they will assume that they are all the same and that all of them are equal to the worst version you hear of. That’s why “telemetry’s” meaning has changed. You cannot blame the general populace for this. The root cause is how bad acting companies have added language to Privacy Policies such that:
* just about all data == telemetry
* telemetry can be used for “other purposes”
* privacy policies can change without notice
* you cannot opt-out
These are the worst case scenarios. Do not blame people for assuming the worst. Blame those who have changed the rules of the game such that assuming the worst is the default.
Ah, so blame lawyers, product managers, anyone who releases free as in beer software, the open source contributors making all that software everyone else is stealing, the customer’s unrealistic expectation that someone else should pay for things, engineers for lacking the affect to care about any of this or to empathize even one iota with the real economy, product designers for being outwardly countercultural while authoring the literal dark patterns that get the telemetry in the first place...
No. I hope this burns to read.
Software never improves because incompetence is the norm. Not because we didn't have a magical data collection unicorn available.
Competent software companies ran user panels, had decent quality control, didn't steamroll their communities, didn't market loudly over user dissent and certainly didn't shut down their issue tracking to even their top tier partners.
That was Microsoft 10 years ago. That is Microsoft today. But you know, Telemetry solves all these problems doesn't it? No.
The real answer to your question: ask and listen. People will gladly tell you. Do not just take the data otherwise you end up with a set of poorly selected metrics which do not represent user opinion and a lot of pissed off customers who don't want to or can't tell you due to legislation and data protection.
Edit: to back up my point, Microsoft closed down Connect with over 30 issues open from me, and our account manager left to go and work for a competitor because he was fed up with dealing with that kind of shit and couldn't even get basic issues from a Gold partner actually escalated to anyone. We had a ticket open for 7 years against ClickOnce, where IE9 broke it completely for about 15,000 users.
As for community steamrolling, this is a repeat of this one again: https://github.com/dotnet/sdk/issues/6145
Edit 2: I have removed some irrelevant stuff. This story goes on forever. I have so many anecdotes from dealing with MSFT pre and post OSS glory that I concentrate all my effort on staying as far away as possible.
Microsoft have a team of people who look at crash reports, and categorise the results (see for example https://devblogs.microsoft.com/oldnewthing/20050412-47/?p=35... , just a quick thing I found).
Having the ability to track the crashes of millions of machines, to find patterns in which drivers are crashing which applications, seems like an impossible thing to replace.
Being unable to opt out, with collection enabled by default, is what is unacceptable.
Nevertheless, consent is still paramount. Removing consent on the basis that most users are incapable of being informed is a poor excuse.
Most people aren't really stupid; rather, bad software makes them look stupid, and bad tech support shifts the blame to the users.
And likewise simply listening to a vocal minority via "ask and listen" is not a silver bullet.
So, you're both right, you need both to make informed decisions.
I've built and supported software with 80k end users and did that effectively single-handedly.
I'm sure a large corp can do the same if it sacrifices a bit of bottom line...
This sounds great in theory - harder to do in practice. Often what ends up happening is the only people who will share their time with you are the ones who want something specifically changed for them. Thus my point, it's effective, but it's not a silver bullet.
> I've built and supported software with 80k end users and did that effectively single-handedly.
And plenty of businesses have used Google Analytics, Mixpanel, etc. combined with the aforementioned technique.
TL;DR - The two strategies are not mutually exclusive.
And therein lies the root of "telemetry" — the SV bubble's lack of interest, lack of effort, and fear of interfacing with the wetware.
Telemetry is easy. Talking to people is hard. Too bad.
Sure, but lots of things are hard. That doesn't mean we should all be happy about software phoning home without the consent of the user.
The user spent an hour fiddling with settings. Is that because they love the new settings toolbox? Or is it because they were very frustrated with it and couldn't find what they were looking for?
For example, I have an Xbox One controller. It used to pair fine via Bluetooth. It still pairs fine with my Mac. Other stuff still pairs fine with my PC. But it just won't work after a Windows update.
What is telemetry going to tell them that they don't already know from the forum? Maybe the scope, but it's fuzzy. Some users might give up after one or two tries. Some users might be using the "Add hardware" box several times in a row for different reasons. Telemetry isn't a magic insights thing. It's difficult to get right, and to draw the right conclusions.
One thing's for sure, telemetry's cheaper than QA-ing updates properly.
You could argue that telemetry should then only exist in your beta channel or testing builds, and some developers do that. It's silly to argue everything can be caught by your QA team, that is simply not true for online services. In the past projects I've worked on have had long-standing bugs that took weeks of ongoing effort between both our paid QA staff and customers to finally identify reproduction steps, at which point we were able to examine telemetry for those reproductions and fix the problem.
"This is happening for 100% of users with the B2C3D4 controller and is likely a driver bug, but has happened only twice on the C3D5 device, both for the same user - likely a hardware failure"
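A toy sketch of the kind of aggregation behind a conclusion like that (the device models, users, and thresholds are invented for illustration): group crash reports by device model, then compare how many distinct users are affected against the model's install base.

```python
from collections import defaultdict

def triage(reports, install_base):
    """Classify crash clusters: widespread across a model vs. one bad unit."""
    by_model = defaultdict(set)
    for model, user in reports:
        by_model[model].add(user)
    verdicts = {}
    for model, users in by_model.items():
        share = len(users) / install_base[model]
        if share > 0.5:
            verdicts[model] = "likely a driver bug"       # most units affected
        elif len(users) == 1:
            verdicts[model] = "likely a hardware failure"  # one unit, repeated
        else:
            verdicts[model] = "inconclusive"
    return verdicts

# Hypothetical crash reports: (device_model, reporting_user)
reports = [("B2C3D4", "u1"), ("B2C3D4", "u2"), ("B2C3D4", "u3"),
           ("C3D5", "u9"), ("C3D5", "u9")]
print(triage(reports, {"B2C3D4": 3, "C3D5": 500}))
```

The same crash count means very different things depending on whether it comes from many machines or one, which is exactly the pattern-across-millions argument made above.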
What will happen if they won't? Are you going to switch to different Windows or Office?
There are dozens of ways to get feedback from users, but most of them require the company to pay for them. Companies are as bad as your average Joe in this fashion; why pay when they can simply pretend that privacy and data protection laws don't exist and just take?
I said it elsewhere in this topic: My purchase of your software does not give you the right to exfiltrate data from my system. You're welcome to ask for it, or to pay for it, but in no way is it yours to just take.
Additionally, mandatory telemetry was in no part of the purchase process (well, neither were ads). Instead, it's in a completely separate clickwrap 'agreement' (that's subject to change without warning) that's only made clear when you're installing the software.
One thing you're omitting: Windows XP Home was only $199, just like Windows 7. Vista seems to be an outlier in terms of pricing.
Also, all the prices you've listed are for the full version (ie. not upgrade). The upgrade prices are much lower, and are in line with the current price for windows 10, which does not have separate pricing for upgrade vs full. You can still interpret this as a price drop, but most people get their computers through OEMs, and so aren't paying retail prices. I suspect people who build their own PCs also tend to not buy legitimate licenses. Also, AFAIK the checks for the "upgrade" version aren't particularly rigorous. You could install a pirated copy first, leave it activated, then do a clean install, and it wouldn't complain about the licensing.
"Surveillance" has already been thoroughly scorched by the reaction to the US government's broad violation of the Fourth Amendment. Talk to anyone in the defense industry. Now, rather than internalizing that many people don't want to be spied on, product managers are deciding to double-down on surveillance but use euphemisms. Don't whine when people respond by scorching the new word as well.
They could call it "logging" next year and we'll start tarring and feathering the word logging. The issue is not the word, but the behavior it represents.
Gathering detailed usage behavior of applications must be made optional to the user, as doing so without opt-out is decidedly hostile to the user's privacy.
You say it is impossible to get feedback from programs without a vocal minority being dominant. This is untrue on the surface since providing the option to disable telemetry removes a minority of users, probably a set highly correlated with the vocal minority you are concerned about. So if they have something to let you know, they'll probably contact you directly—the old fashioned way.
As others have pointed out, there's also no compelling evidence that software is better since the advent of widespread telemetry. Telemetry so often lacks context. You don't know what the user was trying to do; only what they did. Just because a feature is used a lot, that doesn't mean it's a good feature. It's merely what users have found in your software that approximately does what they intended. What's unseen, what can't be seen, is intent. You can't (yet, thankfully) measure the reluctance or happiness of the user as they pressed the button.
Even when working ideally, and observing willing users, telemetry has a nasty habit of navigating products to local maxima at the expense of more quickly finding significantly better options.
Which almost nobody does, unless they are either completely unsatisfied with the software or completely in love with the software (just like TripAdvisor feedback)
Good point. Better build in a surveillance engine to spy on your users. /s
This seems like an issue with how the product presents that choice or ability to users.
For example about half of the people who make it to the end of my video courses write in and give me feedback based on a 7+ question form I ask them to fill out at the end of the course. I ask very specific questions that could likely be answered in 1 minute or less each. Out of thousands of submissions, a huge majority are positive.
I don't ask for feedback or anything early on, and do my best to avoid giving someone extra "work" to do. I present it in the form of "hey, I see you made it to the end of the course, your feedback is helpful so that my next course is even better aligned with what you want...".
Folks are happy to provide feedback in that case.
I'm one of those people who typically turn telemetry off when I can, because in a lot of cases it's not clear how it'll be used unless I read a 100-page TOS, or I simply don't trust that the company is telling the truth about how it'll be used. I shouldn't feel like I need to diagnose my network traffic with Wireshark just to double-check that a company isn't harvesting usage stats about an app I'm using.
Because they're not asked.
There's a reason that gathering feedback from IVR and web sites is a multi-million dollar industry. It asks people, and they respond.
There's a sports quote about "You miss 100% of the shots you don't take." The tech industry has to learn that it misses out on 100% of the feedback it doesn't ask for.
Hire people specialised in QA, send someone over to big customers to observe how people use your product, and ask/pay a customer to interview a random selection of the people using the software.
Or just stalk your customers. It doesn't require human interaction, it's cheap, and probably not illegal enough to actually get fined. Who cares about the actual opinions of your customers when you can just interpret some carefully selected dashboards, right?
Data mining is about using the customer as product, not improving yours.
Telemetry isn’t about better user feedback, it’s about cheaper user feedback, even at the price of quality and ethics.
Additionally, what's to stop Microsoft from turning their telemetry data into sales or marketing data?
Software, especially Microsoft software, became much worse in the past 5 years or so despite their heavy push on telemetry. Therefore I don't think telemetry is a magic bullet that will make software better.
The calculator used to be fast and load instantly, now it's one of those UWP monsters that even asks you to rate it in the Microsoft store...
I don't recall hearing about updates bricking machines or causing data loss at scale back in the Windows 7 days but it seems like that is now a relatively common occurrence, amplified by the fact that you can no longer hide/defer updates on consumer versions of Windows. I think the firing of their QA team and delegating the work to unpaid "insiders" and telemetry might have something to do with this.
The new Settings UI is absolutely disgusting both in looks and information density and is a clear downgrade from the previous version.
I can go on and on. I would sympathize if they were pushing the boundaries of software engineering but what we're talking about isn't groundbreaking - these are problems that were mostly solved a decade ago and Microsoft intentionally backtracked on their progress by the looks of it.
This could also be explained by user expectations for software rising but quality of Microsoft code remaining constant. In the past users may have written off such events as 'just the way computers work sometimes' but perhaps now users realize that computers needn't be so unreliable.
I disagree. Evidence that supports MS code quality dropping includes a significant number of users hanging on to Windows 7 with their cold dead hands, even after years of MS marketing, arm twisting, GWX updates, and EOLing Windows 7, with users paying for ESUs via Ask Woody vendors and/or that 0patch tool.
I myself moved to Windows 8.1 and from there am hemming and hawing over whether to use KDE Neon or Linux Mint XFCE, leaving behind Windows except for the air-gapped Windows 7 VM I will no doubt need for things like Anime Studio. I will not allow Windows 10 (outside work devices) on my home network.
(Maybe for Centaurus aka Courier Jr...but I'll put it on the guest wifi and make a bunch of throwaway accounts for it. )
How long does it take after boot for disk I/O to stop if you've got a 5400 rpm hard drive? It's maybe a few minutes after login on 7, and I've never seen it stop on 10.
Why does the calculator take seconds to start? Why does it ignore keypresses when it has focus?
How many clicks does it take to set my IP on an isolated network with no DHCPd? How many different control interfaces will I see on the way?
On Old Edge (I haven't tried Chrome based Edge), why does the stop button sometimes not stop until the page finishes loading over several seconds? Why do the back and forward, and url navigation interactions queue up in that case?
>How long does it take after boot for disk I/O to stop if you've got a 5400 rpm hard drive? It's maybe a few minutes after login on 7, and I've never seen it stop on 10.
Unsure what you are saying. Windows is the only OS to write to disk after boot? I don't see anything hitting the disk after a few seconds.
>Why does the calculator take seconds to start?
>Why does it ignore keypresses when it has focus?
unable to reproduce
>How many clicks does it take to set my IP on an isolated network with no DHCPd
>On Old Edge
I don't use that particular piece of software so I can't tell you.
It's "instant" to show up, but you have to wait one to two seconds for the splash screen to disappear, whereas the Windows 7 version showed up instantly and was usable immediately.
- Uncontrollable automatic updates on Windows 10
- Forced reboots for updates even if the computer is in the middle of a long-running task
- Ruining of Windows search to the point it is basically unusable
- Removal of Paint and deprecation of the Snipping Tool
You know, just to name a few.
Oh God, don't get me started. A few days ago I was in the middle of a videoconference with some important people, when suddenly my screen went blue, flashed a Windows spinner and the word "Restarting", and boom. It just rebooted. With no warning, despite me being on a videocall, on Microsoft Teams of all things! And within the "active hours". How this behavior is acceptable is beyond me.
Corporate-forced updates, however...
Why? Because I set my dual-GPU gaming rig to sleep every night, and Windows feels the need to wake it up about an hour later, fail to do updates, and then leave the machine on the rest of the night, even with auto-sleep set for 30 minutes.
I noticed the problem a few weeks later, and by then my electricity bill had increased by 20 bucks.
For example, I have seen videos of MS Word and MS Visual Studio on an old Pentium loading instantly, the splash screen flashing by. I was truly impressed indeed.
For example, Word 6 on my 386DX with 4 MB RAM took some time to launch and I had ample time to admire the art that went into the splashscreen. On 486DX with 16 MB RAM, it did flash by.
Of course what constitutes an improvement or regression may vary by person
Old Microsoft software had a simple toggle switch for this.
By default, windows 10 lets Microsoft engineers remotely log into your box and browse your filesystem. They say they only use it for diagnostic purposes, but I don’t see how that could be true unless they’re in violation of US law, which compels them to give the same access to law enforcement.
I’m not sure if you can opt out of that (or whether the opt out would survive a warrant).
I switched away from windows over this sort of thing. There were dozens of other objectionable things they were caught doing, and efforts to build windows 10 “decrappifiers” made it clear they were adding new telemetry every month, and laundering the data through sock puppet domains.
I'd like to see a reputable source for that claim.
> Full: All data necessary to identify and help to fix problems, plus data from the Security, Basic, and Enhanced levels.
In the Windows 8 days, they claimed that engineers couldn’t silently pull individual files from machines without managerial approval. I can’t find the source. It was some old news article with an interview with a Microsoft manager.
Anyway, “All data necessary to identify and help to fix problems” pretty clearly implies they can pull whatever they want as they debug. I don’t see how they could implement that without exposing customers to warrant requests.
This page outlines everything additional they receive on the Full setting.
> In the Windows 8 days, they claimed that engineers couldn’t silently pull individual files from machines without managerial approval. I can’t find the source. It was some old news article with an interview with a Microsoft manager.
I recall reading something similar, but for Windows 10. AFAIK it said that engineers diagnosing a difficult problem can select a group of machines to receive raw telemetry from, after getting permission from managers + microsoft's privacy team. I have a feeling it was for insider builds only though.
The GP comment seemed to imply that Microsoft engineers could log in remotely without your knowledge or consent.
This is correct: AFAIK they need a password/acceptance from the user, that's the proviso, but the original comment didn't say "without anyone knowing" (and as it's closed source, none of us knows for sure). Their quoted claim is true; it's just of very limited value.
This whole thread is going nowhere.
The first question should've been "yes, but can they do it without a password or user-acceptance". The answer is "we don't know" AFAIAA.
And I would say most software today is incredibly buggy. Almost every major piece of software I use now from large, well known companies is just rife with bugs.
You're right, it has been bastardized.
We used to call it what it actually is: spyware.
Usage data is a better name for what gets returned in most telemetry.
I would much rather genuine telemetry is supported so it can be reported in a way that doesn't allow a "you agree to this" hook to be used for both innocent telemetry and problematic telemetry.
Throwing your toys out of the pram at any usage data going back is harmful, not helpful. It will mean that the bad actors will win, because they'll be the only ones who have the data to improve their products.
Or everything will go SaaS, so you'll get desktop "software" that's nothing more than a shell making HTTP calls back to a backend so all the usage gets tracked there, and it'll be slow as shit for the privilege.
I don't know where you heard that, but that is absolutely untrue. There have been anti-spyware apps dating back to the late '90s and before, and they were essentially AV scanners with a slightly different focus.
The New Oxford American Dictionary defines it as "software that enables a user to obtain covert information about another's computer activities by transmitting data covertly from their hard drive"
Without consent / explicit opt-in, this describes most telemetry perfectly.
There are tons of other products in this world that don't rely on surveilling their users' every single move to solve these problems and improve the product.
Besides, there is a huge assumption that the data gathered will be used only to improve the value of the product for the user. Considering that the two sides of the market, the buyer and the seller, are in adversarial dynamics when deciding the price point, it is irrational for the seller not to use this information to increase their profitability. They might as well, and indeed do, use telemetry to cut their costs without moving their price: for example, by cutting support for existing but under-demanded features, diverting resources that could fix bugs in an existing feature to features of the next product, or engaging in extractive behavior such as upselling new products. All of this shortchanges the end user.
Some problems get reported for years without any fix in sight.
Like, stop the telemetry nonsense and read the darned forums, starting with your own?
1. well-behaved programs that do what I tell them to do,
2. programs that surreptitiously send my data to an external party without being told to.
I thought we were calling the second category of behaviour "telemetry" (or "spying" when we're not being polite). If not, what is the correct terminology?
Those users participate after providing consent, and are paid for the feedback they offer.
Your comment could be seen to imply that software developers are entitled to behaviour data from users and organizations without their consent (or even awareness).
The best way to understand and improve a product if you lack the ability to gather telemetry or user feedback is to use it yourself and identify areas for improvement.
Needless to say, authors weren't thrilled about it
You won't hear about it because nobody will respect that setting in any meaningful way.
This is how it was done before telemetry was possible. Good software was made back then. All our software today is built with or on top of software that was developed without any telemetry.
Today it's much more common to see companies / product managers using telemetry, instead of their brains, and making bad decisions as a result. There are always confounding factors, and they usually dominate. Collecting numbers is easy, collecting the right numbers is hard, product teams don't have time for that. Telemetry ends up mostly being used for excuses for bad decisions.
Is there a particular company size when you suddenly can't collect telemetry because of the privacy implications? Why aren't they allowed to compete using the same tools as everyone else?
Microsoft's telemetry practices SHOULD DEFINITELY be more heavily scrutinized than other companies.
Your website uses Fathom, which focuses on privacy but doesn't have any third-party auditing of that claim (searched their site for "audit", no results). Why do you trust them to do the right thing with my browsing data when I visit your website? What are you using the data for, to improve the site and create content more people are interested in? Why can't Microsoft do that too, to improve their applications?
I don't say all of this to downplay the importance of knowing that things are collecting telemetry and shipping it off somewhere but we can't just have a blanket statement of "you can't do it once you're big enough to abuse it" be our guiding policy either.
The problem is on the business side of software.
I'm seeing a lot of pearl clutching.
As it is now, they don't deserve it and the dark patterns being deployed ought to be criminal.
It is a word that has been cleaned up and packaged to minimize discussion and confrontation.
There are plenty of words like this.
Even "justice" or "freedom" are ambiguous and allow multiple meanings. Two people can talk about justice without actually meaning the same thing.
One wonders how we ever came this far without looking over the shoulders of users.
If they want to understand how people use their products they can perform usability research. If they want me to participate in such research, they can offer to pay me for my time. Snooping on me with embedded spyware is not acceptable.
1. Let's collect information on every URL, including every distinct pornhub URL, a user visits.
2. Let's collect information on how many times a user browses pornhub.
3. Let's collect information about how and when a pornhub site crashes our browser.
4. Let's collect information about how and when our browser crashes - without submitting the website that crashed it, just the internal info such as call stacks.
Only 3 and 4 are about fixing bugs, and while 3 would maybe make it easier to reproduce issues by knowing the exact website that triggers faulty behavior, you end up with a lot of information about your individual users that could be abused.
1 and 2 are about potentially making your product better by being able to tell what websites (or features, etc) users are actually using and focusing on improving those features, or figuring out why they do not use the other great features (e.g. bad UX?). However, this comes at the great expense for user privacy.
What really makes me mad about Microsoft is how little they tell you about what kind of telemetry they actually collect at each configurable level. I do not know if they do 1 or 2 or 3 or 4 or any combination thereof from reading their privacy notice. The privacy notice says (or used to say?) that they can even transmit your files for telemetry/bug-fixing purposes if they feel like it. And they are very unclear about how the data is processed and retained, and how long such retention lasts.
I'd be happy to contribute some telemetry depending on what it is. But they refuse to let me configure it, let alone tell me what they collect, why they collect it, and whether they have proper processes in place before they add certain types of data collection, so in this scenario I'd like to opt out completely. But I cannot, because that is not an option. And I genuinely hope they get slapped with enormous GDPR fines for it.
As a counterpoint, Firefox - while not perfect
- There is an opt-out/opt-in for telemetry/crash reporting
- They have a privacy document describing what they collect and when: https://www.mozilla.org/en-US/privacy/firefox/
- They also openly describe their process, which makes me more confident in them taking this seriously:
- They show me what telemetry exactly they collect: about:telemetry
(While it's certainly hard to figure what all that data means, it being there in the first place means that third party experts can easily access it and evaluate it)
This telemetry is mandatory. Users are not permitted to opt-out.
This means that the average Joe must understand what information is being sent, and the submission must be opt-in rather than opt-out.
It's possible to do telemetry in a way that does not violate this law, but that means you're not allowed to do more than basic aggregates that can just as easily be collected on the backend. A collection of installed software can easily act as a globally unique identifier because every PC installation is different, so even training a "recommended for you" system that just finds other software that people commonly install along with a certain package must already be opt-in.
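A small sketch of why an installed-software list works as a global identifier, using made-up package lists: the full sorted list hashes to an effectively unique per-machine value, whereas a backend-side aggregate like per-package download counts carries no such identifier.

```python
import hashlib
from collections import Counter

# Two hypothetical machines: even one differing package changes the hash.
machine_a = ["firefox", "vlc", "7zip", "git"]
machine_b = ["firefox", "vlc", "7zip", "gimp"]

def fingerprint(installed):
    """The full sorted install list hashes to a near-unique machine ID."""
    joined = ",".join(sorted(installed))
    return hashlib.sha256(joined.encode()).hexdigest()[:16]

print(fingerprint(machine_a) == fingerprint(machine_b))  # False: distinct IDs

# By contrast, a per-package download counter (collectable on the backend
# at download time) is a plain aggregate, not tied to any one machine.
downloads = Counter()
for machine in (machine_a, machine_b):
    downloads.update(machine)
print(downloads["firefox"])  # 2
```

The fingerprint is also order-independent, so it survives across uploads; that stability is precisely what makes a raw install list personal data under the argument above, while the counter on its own is not.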
The developers add insult to injury by also lying about the theoretical ability to opt out.