
According to the FAQ, "It’s also possible that Google does not use student data for any of these purposes—but unfortunately, Google has refused to articulate the reasons", so it seems that the EFF's position on Internet-hosted applications is that the specific uses of each kind of data should be described in a privacy policy.

I think it's a tenable but extreme position, because basically they are objecting to Google reserving the right to develop new features in an empirical/data-driven way.

I think most people don't think of, e.g., their privacy w/r/t tax data being compromised when their tax prep software company mines it to make data entry simpler, or to make it easier to understand the consequences of various filing choices through visualization, etc. Similarly, I don't think Google is invading my privacy when it takes my search queries and uses them not only to produce SERPs for me but also to notice that when people type cyombinator it is likely a typo for ycombinator.
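
For concreteness, that typo feature is the sort of thing that falls straight out of aggregate query counts. A minimal sketch of the idea in Python (the frequency numbers are invented, and this is not claimed to be Google's actual pipeline):

    # Hypothetical sketch: how aggregate query-log counts could surface a
    # likely typo correction. The counts below are invented for illustration.
    from collections import Counter

    query_counts = Counter({
        "ycombinator": 120_000,   # assumed query frequency from a log
        "cyombinator": 40,
        "hacker news": 500_000,
    })

    def one_edit_away(a, b):
        """True if b is one substitution, adjacent swap, insert, or delete from a."""
        if a == b or abs(len(a) - len(b)) > 1:
            return False
        if len(a) == len(b):
            diffs = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
            if len(diffs) == 1:
                return True
            return (len(diffs) == 2 and diffs[1] == diffs[0] + 1
                    and a[diffs[0]] == b[diffs[1]] and a[diffs[1]] == b[diffs[0]])
        longer, shorter = (a, b) if len(a) > len(b) else (b, a)
        return any(longer[:i] + longer[i + 1:] == shorter for i in range(len(longer)))

    def suggest(query):
        """Suggest a far more common query that is one edit away, if any."""
        candidates = [q for q in query_counts if one_edit_away(query, q)]
        best = max(candidates, key=query_counts.get, default=None)
        if best and query_counts[best] > 100 * query_counts.get(query, 1):
            return best
        return None

    print(suggest("cyombinator"))  # -> ycombinator

Nothing in the sketch needs to know who typed what; it only needs the counts.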




I don't think it's asking much for Google to put a wall around the profiles and associated usage data for these school related accounts.


As I said, I think reasonable people can disagree about it, but I think they are demanding that the wall have a shape like "this data will be displayed back to you while you work on it and to your teacher during the grading period and otherwise it will not be used," whereas many people are totally happy for the wall to be "we will use this data to help you learn and allow your teacher to evaluate your progress." Lots of people object to advertising to kids, not so many people object to building a better word processor or kids-oriented theorem prover or whatever, including ideas that haven't been thought about yet but are (in the developer's honest opinion) wholly motivated by the purpose of education.

EFF basically doesn't trust a cloud software company to have any discretion, but I think most people are willing to take an informed risk when they entrust their data to someone else's instructions or computational resources; otherwise they'd write the software themselves and run it on their own device and so forth.


Your example seems fine to me. My kids use Google Docs for school, both for collaborating with other students at home and for submitting homework to the teacher. That seems to fall within your example.

The risks that you describe also sound reasonable for adults to assume. However, Google, being who/what they are (organizing the world's information), shouldn't be surprised when "watchdogs" ask more of them, particularly in this case. They deliberately entered this education space. It's not just about plain advertising (to me). Whatever profiles are built from children's use of Google's services (which they are required to use through school) should be carved out from their normal data harvesting and user-profile nurturing. It should be up to Google to develop a "win/win" model whereby they can protect students and properly monitor app performance. It doesn't sound too challenging for a company like Google.


The problem I have is Google is doing this A/B testing on their public apps in the wild. I don't know if I'm getting the best interface at this moment or am being a guinea pig for some experiment. And not only UI experiments. Remember when Facebook reordered timelines to measure the effect on the reader's mood?

I think experiments should be done in controlled conditions with mock data and informed participants. When you then sell something to the public I expect it to be a finished and stable product. I can then build my workflow around your product and know that it won't be made obsolete overnight because you applied some enhancement in a patch.

When Google used to explicitly mark their apps as "beta" we'd joke about them being in eternal beta mode. It really looks like that's not a joke.


> The problem I have is Google is doing this A/B testing on their public apps in the wild. I don't know if I'm getting the best interface at this moment or am being a guinea pig for some experiment.

My problem with that reasoning is that you're assuming there is a best interface and that it is already known. A/B testing is usually done to determine which choice is best; how will they know if they don't test? Also, who decides what is best? Does best mean that it's incredibly easy to use for 90% of tasks but terribly hard for the other 10% for some reason, or that it's moderately easy to use for 100% of tasks?
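
For what it's worth, the mechanics behind this are usually mundane: bucket users deterministically, then compare whatever outcome metric you picked. A toy sketch (all names, the experiment, and the numbers are invented, not any particular company's system):

    # Toy sketch of A/B bucketing and outcome comparison; everything here is
    # invented for illustration.
    import hashlib

    def assign_variant(user_id, experiment, variants=("A", "B")):
        """Deterministic bucketing: the same user always sees the same variant."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    # Hypothetical outcomes: did the user complete the task under their variant?
    observations = [("u1", True), ("u2", False), ("u3", True), ("u4", True), ("u5", False)]

    totals = {"A": [0, 0], "B": [0, 0]}   # variant -> [completions, trials]
    for user_id, completed in observations:
        v = assign_variant(user_id, "new-toolbar-layout")
        totals[v][0] += int(completed)
        totals[v][1] += 1

    for v, (wins, n) in totals.items():
        print(f"variant {v}: {wins}/{n} tasks completed")

Note that the code can only tell you which variant wins on the metric you chose to count, which is exactly the "who decides what is best" problem.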

> I think experiments should be done in controlled conditions with mock data and informed participants.

The problem with this is that mock data won't get you applicable results in a lot of cases. Along with that, an informed participant affects the selection, further reducing how well the results reflect the real population of users. You've also likely agreed to this kind of testing in the TOS, so you might be considered the informed participant that you mention.


Just because your experiment is difficult doesn't mean it's ethical to use other people's data without their informed consent.


Businesses have experimented with "other people's data" forever. The clerk at the video store chats with customers, learns which ones are chatty, what recommendations work out, etc. This is considered to be good service. But a cloud software company does the same thing with an algorithm and a CPU and suddenly it's an outrageous violation of privacy. Suddenly we need informed consent protocols to change the signage on the store front or the font on the web page. Seems like an overreaction to me, assuming that all cloud companies are out to take zero-sum advantage of you.


Do you seriously not see a difference between the video store clerk recording information that people voluntarily share and the harvesting of personal data such as browsing history?

Nobody cares about Amazon using their own sales records and server logs to generate recommendations. The problem comes when technology companies decide that means they get to use any data.

If you're fine with Google using personal data, are you also fine with FedEx opening up every package they deliver to you?

> take zero-sum advantage of you

Nobody said "all" or "zero sum", which doesn't apply here, but "take advantage of you:" is pretty much a description of capitalism. On HN, this is usually called "monetizing".


Why is the boundary of a firm relevant? If Amazon buys a shoe store, does that make them more legitimate in their use of your shoe shopping data than before? This seems like an arbitrary choice that is biased in favor of big companies.


Incorporation isn't the boundary. Again, why are you conflating business data with the personal data of someone using a product that has nothing whatsoever to do with the business transaction?

If Amazon buys a shoe store, they get the sales records and any other related data. They do not get to know where you walk with their shoes.

If there is any confusion here, it is because of the recent trend toward Service as a Software Substitute, which makes the business's server necessary for normal use of their product. Some people seem to think this lets them open the packages they are conveying or storing.


Do you think FedEx doesn't run analytics capable of telling them, based on parameters of your packages, some pretty deep things about you?


Of course they do. Traffic analysis is always a problem. They obviously know the src/dst addresses and the package dimensions (including weight), as those features are necessary for the delivery.

They still don't know what's inside the package. I really don't see why this boundary is hard to understand. With snail mail (FedEx, USPS, etc.) there are even laws that protect the boundary between the envelope and the private contents. Why would you think software would be different?


> I think it's a tenable but extreme position, because basically they are objecting to Google reserving the right to develop new features in an empirical/data-driven way.

They can develop new features with their data.



