Hacker News | thebigpicture's comments

Is "responsiveness" the same as "responsive design"?

-----


I don't think it is. "Responsiveness" in this article's context is about how responsive the page is, as in how fast it renders and is shown to the user.

"Responsive design", OTOH, is about a page that adapts to the user's screen resolution; e.g., it renders differently on 480x320 vs 1024x768, on the same code base. HTH.

-----


But if you construct the chest so no one can verify what's used to make it, what's the point?

If Apple's products are quality all the way through, why can't I easily open up my iPod/iPhone/iPad?

If it's quality all the way through, why would you hide what's on the inside?

-----


Part of its high-quality polish and finish is the lack of seams, protrusions, and screws. That necessarily means something that's difficult to open up.

You can make an argument that they made the wrong trade-off between aesthetics and pragmatism, but for the trade-off they made, they executed well.

-----


In some cases the third-party stuff might be better than what Apple sells. Apple is a lost cause.

-----


It must be great to have such naive clients.

-----


Thank you MS Research for a dose of sanity. "Big data" seems very potent as far as marketing buzzwords go. It plays on people's ignorance and the general sentiment of "too much information".

I'll be keeping this pdf in my "rebuttals to idiocy" folder.

There are some industries that certainly do have "big data" (Wikipedia has some definitions of "big data" that include size ranges, for whatever that's worth), but it does not seem like companies with "big data" are the only targets of "big data" marketing. And from what I know about available solutions, if I really had a "big data" problem (e.g., 100 terabytes, not 100 gigabytes) then I would not be choosing Hadoop. I also would not choose SQL or "NoSQL". But that's just me. Some of the best solutions I've found have nearly zero marketing. Go figure.

-----


fmt 1 < file seems like it might be handy

i don't understand his first one though. i thought both linux and bsd greps had the -H option (show filename).

here's some more:

   1. sed t file instead of cat file

   2. echo dir/*|tr '\040' '\012' instead of find dir

   3. echo dir/*|tr '\040' '\012' |sed 's/.*/program & |program2 /' > file; . file instead of xargs or find
(of course this assumes you keep filenames with spaces or other dangerous characters off your system.)

   4. same as 3. but use split to split file into parts.  then execute each part on a different cpu.
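here's a runnable sketch of 3. and 4. combined, using a scratch directory and `wc -c` as a stand-in for `program` (the directory name, the 2-line chunk size, and the `part.` prefix are all just for the demo):

```shell
# Assumes filenames contain no spaces or other dangerous
# characters, per the caveat above.
mkdir -p demo
touch demo/a demo/b demo/c demo/d

# tip 3: expand the glob one filename per line, then wrap each
# line in a command and write the result to a command file
echo demo/* | tr '\040' '\012' | sed 's/.*/wc -c & /' > cmds

# tip 4: split the command file into 2-line chunks and run each
# chunk as its own background shell, one per CPU
split -l 2 cmds part.
for p in part.*; do sh "$p" & done
wait
```

(with 4 files and `-l 2`, split produces two chunks, `part.aa` and `part.ab`, each running two of the commands.)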

-----


The point of the first one is he's abusing grep as cat; what he really wants is a cat that shows filenames, but since there's no such thing he uses "grep ." as a substitute.
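A quick illustration of that point, using two throwaway files (the filenames are mine, just for the demo):

```shell
# With more than one file argument, grep prefixes each matching
# line with its filename -- and "grep ." matches every non-empty
# line, so it behaves like a filename-printing cat.
printf 'alpha\n' > one.txt
printf 'beta\n'  > two.txt
grep . one.txt two.txt
# prints:
#   one.txt:alpha
#   two.txt:beta
```

And as the parent notes, both GNU and BSD grep accept -H to force that filename prefix even with a single file argument.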

-----


Maybe he just meant from the user's perspective. If someone just wants to be able to call their friends and family, they do not need 32 million users to switch over to anything. What they need are simple instructions and the code to set up the network and make those calls.

The people who brought you Skype were a bit "bold", don't you think? Shouldn't they have just left VOIP to the telcos? The telcos have hundreds of millions of customers. How many did Skype have when it started?

Is this a startup forum or an "I love the status quo" forum?

-----


I didn't mean to be discouraging, sorry about that. It's just that I think "replacing Skype" is more about non-technical things than setting up a SIP server. I used to be involved with this industry back in the day of SER 0.x. It was pretty hard to get people to switch to use anything new, but Skype did that. They were just bringing out the Skype USB handsets and other bundle deals in my country back then.

Nowadays I use many different networks for communication, such as Skype, MS Live, GTalk, Facebook. The reason is that different people are online in different networks and they're not switching from A to B just if I ask. They have their own existing contacts in their networks.

Still I definitely support anything that can replace Skype, since it's totally closed and there's no way to verify the security. And I hate the Mac client.

-----


Sorry if I came across the wrong way. It's just that I find the "replace program x" idea to be counterproductive. It makes it far too easy to dismiss everything.

Maybe the goal is not to "replace" Skype (that would probably imply duplicating all its features, right?) but to offer alternatives for doing some, maybe even all, maybe even more than you can currently do using Skype.

We cannot just assume that everyone wants to replace other programs/solutions whenever they offer an alternative. It may just be additive.

You have given an example: You use many different networks for different things. None of them are exclusive. You can use them all if you like.

It might be easier to get uptake of something "new" (and nurture that infamous "network effect") if it is not framed as "replacement for enormously popular program x that users already know how to use and which works good enough".

Does that make sense?

If I have a cool alternative OS to share with HN, how far do you think I will get if I frame it as a "replacement for iOS/OSX/Linux"? Not very far. The problem is that even if you do not explicitly claim it is a replacement for anything, HN'ers still assume that's what you are implying by even mentioning it.

This is nuts. If someone offers me an OS/program/solution that can do something I can't currently do with the existing OS/programs/solutions I use, I'm not going to disregard it simply because it duplicates some of the functionality of those programs.

I don't automatically think of the choice as "either-or", I think of it as "should I add this to my options". I ask what can this offer me? Can I split out the functionality in this program that I cannot get in existing programs? But I know many users do view things as either-or. It would be foolish to ignore that. "You can never replace program x." OK, we hear you.

This is why I like small programs that only do one thing. If the user is going to view using your program as an "either-or" decision (it must _replace_ what they currently use) instead of a "can I use this in addition to what I'm already using" decision, then the chances of deciding not to try your program are significantly increased if your program is some sort of do-everything whiz-bang solution. That's because when you offer so many features, some of those features the user is already getting from other existing OS/programs/solutions. They are effectively forced to see things as either-or.

What if, e.g., someone offered just the NAT traversal function of Skype, and someone else offered an encryption program, and yet someone else offered a simple open source command line client (e.g. built with pjsip) that developers could write their own GUIs for? None of them would be trying to "replace" Skype, but using those programs in combination, you could indeed construct a Skype alternative. You might actually be able to do more than Skype can do because it would be a more flexible system. As it stands, you are stuck with that Skype UI. And you're stuck with Microsoft. But if you had an open source command line client that anyone could write a GUI for... and solutions to traversing NAT... and solutions for keeping third parties from tampering/intercepting/eavesdropping...

-----


Using a computer to do the things we normally do, e.g., sending bits to each other, requires that we adopt metaphors. Even something as basic as "email" is a metaphor.

How long does it take to learn a new metaphor? How difficult is it?

It's far easier to just stick with the metaphors you already know. And that is, I think, what "most" people do.

This applies to more than just computers.

"Some" people might like to keep trying new metaphors every week. Who knows?

One thing is for sure. Everyone learns the "email" metaphor.

Not just some people. Everyone. Food for thought.

I think the author is spot on when he says that in this context (computers) metaphors create limitations rather than educating people about what computers can really do. I even see this among developers who, one would think, are the people responsible for enabling users to unlock the full potential of their computers. They are stuck on certain metaphors which limit what they can imagine and therefore implement. Independent thinking and striving for originality are in short supply among developers. The attack of the clones never ceases; in this case, clones of whatever developers see other developers have done.

However, as insightful as the reference to "skeuomorphism" may be, it's clear the author's goal with this post is to downplay Kicksend competitors. Maybe ones that are styled like Instagram? (Polaroid metaphor?)

How many hoops does someone have to jump through to use Kicksend versus using something like Instagram/Facebook (for lack of better examples)? And do they have to pay for the "service"? Maybe that could be a factor?

-----


I don't think everyone learns email as a metaphor though.

While I'm old enough to have sent and received paper mail as a means of communication, it's highly likely that my toddler will have sent email (and internalized the concept) before having interacted with physical mail systems. It's a metaphor for us to call it email but it's a vestigial name to the younger generations.

Think of the floppy disk being a metaphor for saving a file. Few people younger than 15 have used a floppy drive but manage to save files regardless. I also understood radio buttons on web forms as exclusive selections as a young person even though I didn't know it was a metaphor for real buttons on old car radios until a couple of years ago.

-----


But a floppy disk is not a metaphor. The metaphor is a "file" as a description for a series of electromagnetic charges on a floppy disk.

It is indeed interesting that children may likely interact with email before postal mail. But postal mail shows no signs of disappearing anytime soon. Probably not during your lifetime and the lifetime of your startup while you're acquiring a critical mass of users.

Do you think it will be possible to conduct 100% of your affairs in life without ever sending or receiving postal mail? That postal mail will never be used by anyone you transact with?

No doubt we may someday reach a "paperless" world where all your affairs can be handled without ever using postal mail. And where people never use postal mail and have no reason to even know of its existence. Maybe your children will see this come to pass.

But for the purposes of your startup, is that day something you should be concerned about? Is your target market toddlers? Or people old enough to have credit cards, today? People who know how to use email.

-----


I also started using radio buttons long before I learned that they were based on car radios, and once I did so I didn't suddenly start thinking of the buttons as metaphorical; it's just an interesting fact to me. And though I did use floppy disks when I was younger, the idea of a floppy disk no longer even enters my mind when I look for the floppy disk icon; it's just an arbitrary recognizable shape. (The metaphor is "save to floppy disk" as "save to any drive".)

This matters mainly because I won't be bothered if either control starts acting in a way that breaks the metaphor, like a save icon that saves to the cloud (no need to carry a disk around) or a radio button with a design that looks nothing like a car radio (like the current design, as best as I can tell). And it's not black and white: though I've always known a folder is based on a filing cabinet, I've never been bothered that they can be infinitely nested. The more people are familiar with the "metaphor", the less it's necessary to stick to purely metaphorical elements.

-----


I think icons can be arbitrary. That's because I've seen some that are so obviously idiosyncratic to the developer; they bear no relation to the function that I can decipher. Some of them I can't even tell what the heck they are. "WTF is that supposed to be?" It's like your example of the floppy. It's a rectangle. It means save. Does it matter if no one even knows what the heck the icon is supposed to represent? If it's not intuitive? Perhaps it matters for the first few minutes, until I figure out what the program it represents actually does. Maybe it gives me a clue, maybe not. From then on, once I figure it out, it's irrelevant.

This is one of the 1001 reasons I think GUIs are a waste of time. I can just as easily tell a user to hit a particular key (i.e. a tactile button) or type "save". Goodbye ambiguity.

Are icons metaphors? Or are they just symbols?

I am not a linguist, but I think you may be stretching the definition of metaphor if you are thinking of icons as themselves being metaphors.

What is an icon? A button with a superimposed symbol?

Now, if you are saying buttons on a computer screen (which do not necessarily need any symbol superimposed on them to work) are metaphors for physical buttons, e.g. like your example of radio buttons, then that seems a little more reasonable.

I've seen early TV remote controls, 8-tracks, and various other old things with push-in buttons just like car radios. I'm not sure car radios were the first to have these. Maybe early radios, before TV, were the first to have push-in buttons (or whatever the proper name for them is)?

-----


Typing "save" is very far from being unambiguous though. Should it be "save", "store", "write down", "file away", "preserve" or another one of a few dozen ways an unsuspecting person could come up with? Should you type "please" too? At least with GUI there is a button that semi-obviously can be clicked, even if it is unclear what it does. Discoverability is way better with GUI.

-----


What you find ambiguous may not align with what others find ambiguous. You, as a nerd, know there are many options that "save" could entail. Does everyone else know that?

What if a user has no idea about "save options"? To them "save" might just mean "save". That is, they want to be able to retrieve it later.

If I say to you "Jump!", you might ask "How high?" Others might just jump. Should we enlist some participants in a study and see what most people do? Let's ask people what "save" means. Then let's show them a button and ask them what it means.

Why does some software have an option to "show button text"? Why would anyone want to see text on a button? Why would anyone want that?

I wish there was a way I could post to HN in a series of buttons, like hieroglyphics. It would be "way better" than using words. You could click on them and they would bounce up and down. Way better than reading.

-----


Asking users to participate in "two-factor authentication" seems like a great way to match people's personal information to particular devices.

So maybe we have a double-edged sword here. If you want to be able to authenticate you have to give some company the ability to track you and monitor all your activity (which they will try to "monetize"). It sounds sort of tinfoil hat but this is what we are facing.

The reason: We insist on using the web and other "client-server" approaches for almost everything we do using the internet, instead of considering end-to-end, peer-to-peer approaches. Things are so insecure when everything goes (mostly) unencrypted over the open web via middlemen (Facebook servers, Gmail servers, etc.) that we need to try things like "two-factor authentication".

-----

