That was fast. I was trying to hold out hope that Muse would be OK and the acquisition wouldn't change anything, but they wouldn't even let me have that.
Not even a week. The assholes couldn't wait ONE WEEK before trying to spy on all their users. Is this a record for losing the trust of an established user base?
I think the worst is yet to come. I'd venture most users don't even know this happened yet.
We can thank developers for the widespread usage of "telemetry" in apps. (Even the word "telemetry" is designed to mask its real meaning: tracking. Would any developer use the more honest word? Of course not.)
If you want to learn how your users use your software - simply ask them. There are lots of cheap research methods from the usability field.
Expect more telemetry in desktop apps, particularly with the rise of Electron.
Developers don't oppose these developments, they are often the ones who initiate these tracking features in the first place.
Witness the popularity of Electron-based Visual Studio Code with built-in telemetry. Most developers could not care less.
> If you want to learn how your users use your software - simply ask them
And thus lose input from most of your users. This then skews your decisions towards the slim fraction of users that can be bothered to fill in surveys and such, and eventually you lose the other users.
(Opt-in) telemetry is totally fine, IMO. It would be preferable that companies self-host the backend though.
Honest question: as someone who is currently building an audio application that has telemetry, how else would I discover bugs or problems that might be affecting my users if I can't upload crash reports, etc.? I mean, I'll do my best to test before releasing, but when a user contacts support describing some weird behaviour, without telemetry I have zero information to work from and basically have to resort to "stacktrace or GTFO"... Wouldn't it be nice if that stacktrace were anonymously uploaded, so I had some idea of how prevalent the issue was, and maybe some information in there to help me fix the problem and make my users happy?
My personal take is that this type of telemetry is fine. I just don't want to give potentially unbounded, hard-to-audit access to my telemetry data over to the likes of Google. If your application provides a means of learning _what_ is getting sent, and you make it opt-in, and you only send it to infrastructure you completely control (within reason, I'm not going to complain if it's a VPS or something), then there shouldn't really be any controversy.
I don't hate Google. I just don't want my desktop applications to even be capable of telling Google about my computer usage if I can at all help it.
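To make that concrete, here's a minimal sketch of what "opt-in, visible payload, your own endpoint" could look like; the URL and the event fields are invented for illustration, not anything Audacity actually does:

```python
# Sketch of telemetry that is off by default, shows exactly what would be sent,
# and posts only to an endpoint the developer controls. The URL and event fields
# here are made up for the example.
import json
import urllib.request

TELEMETRY_URL = "https://telemetry.example.org/v1/events"  # self-hosted, not Google
OPT_IN = False                                              # must default to off

def send_event(name: str, props: dict) -> None:
    payload = {"event": name, "props": props}               # no user IDs, no machine IDs
    print("Telemetry payload (sent only if you opted in):")
    print(json.dumps(payload, indent=2))
    if not OPT_IN:
        return
    req = urllib.request.Request(
        TELEMETRY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=5)

send_event("export_finished", {"format": "wav", "duration_bucket": "1-5min"})
```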
Save the past n crash logs on the local machine for the user to inspect, and allow the user to turn it off/clear them. When there's a crash, offer to send the most recent crash log.
Similarly, use a double-file-buffered stack trace of the last n minutes of function calls. If the user selects "I just experienced a bug" in the Help menu, offer to send the most recent stack traces.
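Roughly like this, say, in Python (the app directory, filenames, and the log limit are all made up for the sake of the example):

```python
# Sketch of "keep the last n crash logs locally, let the user inspect/clear them,
# and only ever send one when they explicitly agree". All names here are invented.
import sys
import traceback
from datetime import datetime, timezone
from pathlib import Path

CRASH_DIR = Path.home() / ".myaudioapp" / "crashlogs"  # hypothetical app dir
MAX_LOGS = 5                                           # keep only the last n logs

def write_crash_log(exc_type, exc, tb) -> Path:
    """Write the traceback to a local file the user can read or delete."""
    CRASH_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = CRASH_DIR / f"crash-{stamp}.txt"
    path.write_text("".join(traceback.format_exception(exc_type, exc, tb)))
    for old in sorted(CRASH_DIR.glob("crash-*.txt"))[:-MAX_LOGS]:
        old.unlink()                                   # rotate: drop the oldest logs
    return path

def crash_hook(exc_type, exc, tb):
    path = write_crash_log(exc_type, exc, tb)
    # No automatic upload. Tell the user where the log is; sending it is their call.
    print(f"Crash log saved to {path}. Attach it to a bug report if you want to.",
          file=sys.stderr)

sys.excepthook = crash_hook
```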
I haven't managed a big project before, but is spying on your users really the only way to handle bug monitoring? Couldn't people just...report bugs when they happen? Submit logs? You know, voluntarily? Hey, you could even have a version with telemetry, quarantined off by itself with a bunch of warnings that people can download and run if they want to help, similar to non-release builds. But forcing it on everyone, even if it's "off by default," is no good.
"Why have telemetry at all? Essentially, it's to help us to identify product issues early."
"We will consider replacing Google and Yandex with another service if we find one that fulfills our requirements - thanks for the suggestions and keep them coming."
"Just to reiterate, telemetry is completely optional and disabled by default. We will try to make it as clear as possible exactly what data is collected if the user chooses to opt-in and enable telemetry. We will consider adding the fine-grained controls that some of you have asked for."
In other words: "We're sorry you don't like what we're doing, but there's nothing wrong with it and we're going to keep doing it." Wow. I don't want fine-grained control; I want it out of the code.
This seems universally hated. I am usually pretty privacy conscious and do a lot of stuff that I don't want to be seen by the authorities / competitors / my family / whoever, depending on context. But honestly I don't really understand the issue with telemetry. What is your concrete worry, or your threat model?
A) I can imagine somebody you know works at Google, and tries to stalk you. But they'd need to know your IP address, and I assume Google has protections in place against accessing consumer data, and it would leave an audit trail. And in the end, they would only be able to see when you edit audio files and whether you use the toolbar or keyboard shortcuts.
B) Or maybe they feed that data into a super scary tracking algorithm and use it to better target ads? But I don't know how that would even work, as they would have to understand what Audacity's tracking categories mean.
I know it is a controversial opinion, but I think telemetry is usually a net win. Especially if you want to improve the UX of your open source app, you need actual usage data. And unfortunately, opt-in doesn't work, because you get skewed data. Why would anybody opt in? There's no benefit to me to take part, but there is benefit to all if everybody takes part.
1) Microsoft thought (in some HN thread a while back) the Terminal data collection was perfectly safe because it was only stored locally, but it turned out they were accidentally sending it to their servers.
Lots of companies have created bugs that accidentally erased the wrong files (e.g. Apple erasing the whole disk when it had a space in its name), so it's plausible that they could accidentally upload the wrong files.
If collecting user data is prioritized above protecting user privacy, it's more likely that this kind of mistake will occur.
2) Anonymizing data is tricky. It often turns out not to be as anonymous as was claimed or assumed.
If this is not considered a particularly important issue, mistakes are virtually guaranteed to happen.
3) As you said, there is no benefit for the user, so they gain nothing in return for the unnecessary risks introduced by this anti-feature.
In my use case as a grad student in phonology and a linguist interested, inter alia, in sociophonetics, de-anonymised data could get me in big, career-ending trouble. Imagine me studying some minority's language use: even if the revealed data doesn't put them at huge risk, I'll get in trouble with the ethics committee. And it might be possible to de-anonymise participants using e.g. file names or location through IP, and get the participants themselves in trouble. Essentially this merging means that Audacity, for me, is categorised as "can't be used in research", since one of the main reasons we rely on FOSS in research is that this shit doesn't happen. Luckily it's not like this is a web browser, where I'd have no alternative if this one obscure requirement isn't met; I can just record in Praat, or maybe Ardour, and call it a day.
That's apart from the many other moral issues, of course. This whole business of buying up the work of a FOSS community is pretty disturbing, and it inevitably pushes me towards using ethical source licences in any programming I might do, so that this sort of thing doesn't happen.
Also, the elephant in the room is of course that this is opt-in ___for now___. Recall the days when Firefox wasn't plagued with privacy blunders, studies, and Google money? There's no reason to believe Audacity won't end up going the way of Mozilla too.
Telemetry is why the Firefox UI keeps getting worse.
Also, I don't want to deal with an audio tool connecting out over the internet. They can listen to what their users say on an issue tracker and write regression tests. Hell, if they need help writing regression tests I'll write some for them; that seems to be what caused the problem.
"Opt-in won't work, because most people won't opt in." Ask yourself why this might be, and you have your answer. Also ask yourself how you, or more accurately the devs here are ethically justified in forcing this on people when they don't want it.
You are giving your behavior data away for free. It's a win for the receiver, because they don't usually provide the aggregated data back. Sure, you're also getting a product for "free", but this kind of transaction (data for product) is not required in free software; the philosophy is completely different. And let's face it, the data they collect have intrinsic value in aggregate. I'm not saying they're going to use it for nefarious purposes, but it'll enrich them in a way I'm not sure is completely ethical by my standards (hoarding behavior data, surveillance). Phrased yet another way, you're giving them an edge they can turn into a personal-data business. This may be unanticipated, but as long as the collected data is their private property, it's a real possibility.
The deal is not that I get the product in exchange for data.
The deal is that I get a better product, if everybody shares telemetry. (But in fact I don't get incremental benefit by sharing my personal telemetry.)
There are several applications I use daily where I am sure they would benefit from some kind of telemetry. We have one business application (from a big vendor) where I accidentally learned that salespeople need hours to create invoices because of a recurring crash. Finding this, then learning to reproduce it, writing a bug report, and going back and forth takes days, whereas this particular bug would show up easily in telemetry.
An application I developed worked fine during testing but behaved strangely in the factory where it was used... It turned out that while waiting, users tend to press random things on the touchscreen, and that uncovered a bug. Without proper logging this kind of thing is really difficult to catch.
Mostly I think it's a smoke screen to collect data for advertising (and hey, if an alphabet agency wants in, they can always join the party too!) There are a lot of people that seem to buy into "Telemetry is the only way to find bugs!" though.
Just a reminder: This is coming from the same company that made a bunch of publicly available sheet music hard to access. I'm sure they'll have a number of "interesting" PRs in the future.
Dumb question: can someone explain to me how a private company (Muse Group) "acquires" an open source project with lots of contributors? Who "sells" it? Who receives payment (if anyone)? Who decides the IP is now Muse Group's property?
> Why have telemetry at all? Essentially, it’s to help us to identify product issues early:
> Audacity is widely used across several platforms, but we have no information on the application stability.
Really? You have no information about this? Don't you think instability would be reported by some users?
> It is difficult for us to estimate the size of the user base accurately.
You can obtain an estimate by tracking the number of downloads along with some far less intrusive metrics like user agent, etc.
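For example, a rough sketch of getting an OS breakdown from the download server's access logs instead of from the app itself (the log path and the installer filename pattern are made up):

```python
# Rough sketch: estimate OS share from web-server download logs rather than
# in-app telemetry. Assumes a combined-format access log; the path and the
# installer filename pattern are invented.
import re
from collections import Counter

LOG_PATH = "access.log"                                   # hypothetical server log
DOWNLOAD_RE = re.compile(r'"GET [^"]*audacity[^"]*\.(exe|dmg|AppImage)', re.I)
UA_RE = re.compile(r'"([^"]*)"\s*$')                      # last quoted field = User-Agent

def os_from_user_agent(ua: str) -> str:
    if "Windows" in ua:
        return "Windows"
    if "Mac OS X" in ua or "Macintosh" in ua:
        return "macOS"
    if "Linux" in ua:
        return "Linux"
    return "other"

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        if DOWNLOAD_RE.search(line):
            ua = UA_RE.search(line)
            counts[os_from_user_agent(ua.group(1) if ua else "")] += 1

for os_name, n in counts.most_common():
    print(f"{os_name}: {n} installer downloads")
```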
> We need a way to make informed decisions about which OS versions to support. For example, can we raise the minimum version of the macOS to 10.10 to update the wxWidgets to the latest version?
Generally speaking, no, you should not raise the minimum OS version on a ubiquitous piece of software like this. Either work with the library vendor to find a way to support old and new, or use a different library.
At a minimum, keep the old versions around; there are no doubt offline installations out there running on old kit. Don't force everyone to upgrade their hardware/OS (it's expensive, and with Apple products it also means usable hardware going to the landfill).
> We have a known issue with the new file format introduced in Audacity 3.0. We found it with the great help of the community members on our forum. However, there is no way for us to estimate the impact of these issues on users. Is it just a random case? Do we need to rush the work on the recovery tool or help the users one by one? Or do we need to rethink the file format to make it safer and more easily recoverable?
Again, use the estimates from downloads and user agents. And in any event you should prioritize fixes by the severity of the actual bug, not by "how many people it affects". Just focus on writing stable software.
Because we don't need any experience with any software to say that telemetry, any kind of it, is wrong; it has already been abused and will be abused.
Long gone are the days when a program bombed and one would send the core dump, containing a great deal of personal information and sensitive data, to the developers, who would investigate the bug and mail back a uuencoded solution. Sounds old? Yeah, because it is.
This is now a connected world in which every hole can be exploited to make money or gain an advantage over other people; software that phones home, whether in perfect good faith or not, can be exploited to do nasty things. Thumbs down.
Because a huge part of the tech community lives in a kind of "outrage mania" at this point and is just looking for the next shitstorm to post in.
This GitHub thread has literal calls for violence, racism, and other such things in it, and people keep piling on more. Tomorrow we'll have a different thread.
> I've been watching this thing from the start. I've not seen a lot of that.
The worst comments are gone. I reported some to GitHub and they removed the responsible accounts.
> People are outraged at being tracked
People are outraged at opt-in functionality. Whether it should exist is worth discussing (and, of course, how it should be implemented), but let's not pretend that there is some evil move to "steal" your data going on here.
Doing what exactly? Adding opt-in (why do people ignore this part?) functionality for tracking how their software is used, to figure out how to prioritise? I just don't see how people read evil conspiracy stuff into that.
Searching for what this company supposedly did wrong brings up a bunch of terrible articles which literally quote stuff like 4chan greentext with fictional (and implausible) scenarios for how this can be abused.
Is this tracking going to be useful for their purposes? No idea. Should they self-host this instead? Maybe, if that's feasible.
https://old.reddit.com/r/programming/comments/n6kxm8/psa_aud...
Basically, "everything's fine, you can trust us".