Hacker News

Really? Nothing good? From any algorithm at all? How can you be so sure? Amazon seems to do a pretty good job thinking up new stuff I might like with its algorithms, for one.



Try an Amazon account which is shared with family members.

Amazon thinks I like electronics, Hello Kitty and bows.

-----


I've got an Arduino-powered robotic bow-gun firing Hello Kitty droid-doll you might be interested in...

-----


Well, that's hardly Amazon's fault, is it?

-----


Never mind that one account used for shopping for a family is a fairly common use case...

-----


Do you want them to start reading your mind, or stop suggesting stuff altogether?

-----


As suggested by another commenter, that's a false dichotomy. I'd argue there's lots of room for innovation and refinement here.

Manual or algorithmic identification of family members is one option, but I think there's something more to be gained here by understanding the behavior just a bit more abstractly... Shopping moods? Modes? Targets? Tasks? Something, possibly – but I'm only guessing here. The real power could come from actually doing user research and prototyping.

I'd love the chance to work on such a project, actually – I had a chance to work with data from a study on the online shopping habits of mothers a few years ago and there's lots of interesting angles to possibly explore, IMHO.

-----


They could recognize that those things are not usually liked by the same person, ask the user whether this is a family account, and let you define family members and their preferences within that account.
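As a toy sketch of that idea (all profile names, categories and data here are hypothetical, not anything Amazon actually does): once members of a shared account have declared their interests, purchases can be routed to the best-matching member, and recommendations drawn per member instead of per account.

```python
from collections import defaultdict

# Hypothetical declared profiles for a shared family account.
PROFILES = {
    "dad":      {"electronics", "tools"},
    "daughter": {"toys", "stationery"},
}

def route_purchase(item_categories):
    """Assign a purchase to the profile whose interests overlap most."""
    best, best_score = None, 0
    for member, interests in PROFILES.items():
        score = len(interests & set(item_categories))
        if score > best_score:
            best, best_score = member, score
    return best  # None -> no match; fall back to asking the user

def build_histories(purchases):
    """Split a shared account's purchase list into per-member histories."""
    histories = defaultdict(list)
    for name, cats in purchases:
        member = route_purchase(cats)
        histories[member or "unassigned"].append(name)
    return histories

purchases = [
    ("soldering iron", ["electronics", "tools"]),
    ("Hello Kitty plush", ["toys"]),
    ("USB oscilloscope", ["electronics"]),
]
print(build_histories(purchases))
```

Obviously a real recommender would infer profiles rather than require declaring them, but even this crude partitioning would stop "electronics plus Hello Kitty" from being treated as one person's taste.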

-----


Honestly, I have to be nervous logging into my personal account with other people standing around. The "Recently Viewed" and "Recommended for You" are full of things I'd really rather my family members and most of my friends see.

-----


I suspect you meant "not see".

That said, you can manage that at http://www.amazon.com/gp/history

-----


I don't think he's saying that at all. The insanity is trying to construct a platform that manifests a person's entire social life. I think there's a harsh combination of conflated business interests and disinterested, contrived people-sorting that is potentially very, very bad.

-----


I'm not sure I follow you. Right now, Facebook knows roughly this much about me:

1) All of the people I have legitimated as Friends.

2) Screeds and screeds of data about the quality of all my connections with those friends: have I hidden them from my news-feed? How much do I interact with their posts? How much do I interact with them, talk to them, share with them, stalk them? How much time do I spend in interactions with them? How do the patterns of my interactions change with them (mouse-click patterns, scroll patterns, typing patterns)? Do I use different words with them (different emotions, more/less formal language, different function-word patterns)?

3) People I've stalked but haven't friended.

4) Brands, companies, causes, ideas, things from history, all sorts of random shit I've liked (along with similar data to 2)).

5) What sorts of things I click on, how I leave the site and where I go. And certainly more.

Etc., etc., etc. Facebook didn't manifest my entire social life. They just built a platform which allowed me to socialise orders of magnitude more efficiently than before, a platform of great benefit to me, and one into which I've poured vast quantities of data (and the amount, quality and variety of that data is only set to increase as we open up more input modalities, e.g. eye-tracking, gesture-tracking, and biometrics). The data is all there. All Facebook is doing is cobbling together some of the most basic tools one might use to make sense of a subset of that data.

-----


Do any of your family members under 12 have Facebook? Imagine the aggregate of all their experiences in a decade or more.

-----


My little brother and sister had it from 11 and 12 respectively. Yes, it's quite interesting to think about.

Anyway, what's your point? By the way, you're discussing this with someone who believes that in the future we will record and utilise orders of magnitude more data about our lives and our planet than we do now, enough to make your eyes water; that we are inexorably fated to do so; and that it's a really, really good trend, one which is going to play out in highly unpredictable and surprising ways.

-----


So you're a fan of data; fair enough. Who isn't a fan of data? The issue is: who owns and controls that data? You, the true owner and source of the data, or some big advertising-driven tech giant with its own crappy Hollywood movie?

Not only that, but in order to make money from advertising, they link all this wonderful data (your data) with advertiser interests. Much of this linking is done under the radar, via rules you are not allowed to see.

It's very important to distinguish the benefits of shared data from ownership and control of that data, and from the rules by which it is mined and accessed.

BTW, a lot of Facebook data is "self-expression", and if given the choice in real time, many people would elect NOT to have their contributions go into the tech giant's mainstream database. Going into "settings" and messing around with broad privacy options for content types and particular people is an absolute joke in terms of UX and human experience in relation to expressing personal views or communicating with friends.

"Privacy settings" does not come naturally in communication. Facebook knows this, and knows people will not bother or become lazy with privacy.

Bottom line is, Facebook is an inappropriate platform for the collected data that is your life.

-----


What, in your opinion, would be an appropriate platform?

-----


A platform which is transparent, open and shared by its nature. A platform whose first and foremost priority is to be trustworthy, with no conflict of interest between being all of the above and being economically feasible. This would probably have to be a government-funded international effort rather than a single corporation with commercial interests.

The big problem in the future will be that as the line between "real" and "online" life diminishes, the party which holds the data will become an authority akin to the governments we have today. When or why should we trust an authority? How can we be sure that the authority is trustworthy? Politics is hard enough, and we all know we can't trust politicians; how on earth are we going to trust a party with commercial interests to somehow manage our social lives?

You seem very optimistic about things like Facebook collecting data and interlinking people's social lives via their platform using the collected data. I for one find this very scary, to such an extent that I'd rather cripple my social life and not use Facebook than trust a commercial authority with data collected about me over which I have no control.

I provide the data. I see the data. I own the data. I control the data. What is so hard about it? Honestly.

-----


I suppose I've always assumed that no company can escape government, and therefore democratic, scrutiny in the long run. Maybe the situation is different in countries with more corporate corruption.

Also, not using Facebook is difficult for me to imagine. It's pretty baked into my life.

-----


What zxcdw said. :-) Also, my previous post has a typo; it should read "IMHO FB is not currently an appropriate platform"... damn, what a bad typo.

The thing is, Facebook could be better by allowing members to invite outside data to flow into their feeds from chosen sources, and providing more freedom with data-exchange in general. Enforced segregation is costly.

-----


Data doesn't need to be implicitly tied with self-expression.

-----


...

Dude, what does that even mean?

-----



