No one has any intention of diminishing usability or making it hard to manipulate URLs. The team working on this is still actively refining things and studying what works and what doesn't. But phishing is a very big problem, and this change to the omnibox shows real promise in countering the attacks. So I think we would be remiss not to pursue the investigation further.
There is nothing more that can be productively argued about this topic. There will be analogies about how complexity is hidden in various domains (cars, computers) and how beneficial it has been and how users are happy with it. Those arguments are fine and maybe they are being made in good faith, but it doesn't change the underlying future truth:
Marketing will now be changed to reflect Google keywords, not URLs. "www." and ".com" will become meaningless. Google will have put one more level of distance between what the users type in the URL and even what they click in the browser and what is reflected in the address bar.
The issue isn't users recognizing the path; it's the domain. It's also that they aren't taking special care while logging in.
Additionally, what about addressing insecure forms that fail to use HTTPS? Chrome already detects login forms, so just warn users by turning the origin chip red when they are on an insecure login page.
On the whole, supporting secure logins would be better for the internet.
The team may choose to do something like that in the end. That's really the point of experimenting with different approaches; they use them to get feedback, run user studies, and get a sense of what works best.
> Additionally, what about addressing insecure forms that fail to use HTTPS? Chrome already detects login forms, so just warn users by turning the origin chip red when they are on an insecure login page.
That's actually very, very hard (as in NP-hard). Chrome has heuristics for detecting login pages, but it doesn't even detect all legitimate login pages. And it's trivial for a phisher to intentionally make a page that appears exactly like a login page to the user, but will not be detected by Chrome's heuristics.
Well, it wouldn't have to detect all login pages; it could just detect most of them. That would add soft pressure encouraging regular websites to use HTTPS, in the vein of: hopefully we can push the web towards HTTPS everywhere, and users begin to ask "Why is this page not secure?" when performing a login.
This soft pressure worked wonders in a number of places: When Google started doing sitelinks, many websites became much more concerned about making those available on their website. In a similar way, hopefully they would be concerned with getting out of the red for their login pages.
It's not as simple as people think. It never is.
FWIW, Firefox detects insecure login forms and emits a security warning to the web console. This is aimed at developers, however, not users (because the developers are the only ones who can improve the situation).
Our heuristic is imperfect as well. We simply detect <input type="password"> fields on http pages. This works well enough. Trying to detect when developers are abusing <input type="text"> for a password field is non-trivial.
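That heuristic can be sketched roughly. The snippet below is a simplified illustration, not Firefox's actual code: it scans HTML source text, whereas the real browser check runs against the live DOM, and it deliberately shares the stated blind spot (it won't catch `<input type="text">` abused for passwords).

```python
from html.parser import HTMLParser

class PasswordFieldFinder(HTMLParser):
    """Counts <input type="password"> fields in an HTML document."""
    def __init__(self):
        super().__init__()
        self.password_fields = 0

    def handle_starttag(self, tag, attrs):
        if tag == "input" and dict(attrs).get("type") == "password":
            self.password_fields += 1

def insecure_login_form(url, html):
    """Flag pages that serve a password field over plain http.

    Mirrors the simple heuristic described above; it will miss sites
    that abuse <input type="text"> for password entry.
    """
    finder = PasswordFieldFinder()
    finder.feed(html)
    return url.startswith("http://") and finder.password_fields > 0
```

The same page served over https passes, which is exactly the nudge the warning is meant to create.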
There's also mixed content blocking, which is a similar but user-facing feature.
On security: I feel it's a problem, sure, but teaching users about URLs is a band-aid fix, a hack, and completely irrelevant to this change. You can't fix the phishing problem by showing URLs. It needs to be tackled properly and solved transparently for the user.
Google already does the right thing when you use them for DNS (redirecting on misspelled domains). They also implemented blacklists of dodgy content; not ideal, and it doesn't scale, but it's a good start. Better than claiming the user is at fault.
Basically if you need to claim the user is at fault your design is wrong whether you like it or not.
or even simply
I don't even know how he generates those URLs and he didn't respond when I asked why he does it.
Although, I'll note that once again the internet is already on the case. The link you were looking for was http://lmgtfy.com/?q=keyword
Most people have little idea what a URL is, or how to use an address bar.
Watch a few people navigating to http://www.example.com/example.html to see what they do.
I personally have never understood the huge sums of money paid by some people for some URLs. I can't remember URLs; I can't remember keywords. What I can remember is some fragment of a page title, and I hope that's enough to find it in my bookmarks or in a websearch.
I wish Google would just release some numbers about how many people get this stuff wrong, because learning how many people used the + operator correctly made that change much easier to cope with.
Is that ever a good thing to say?
The change is good. I never saw any mention of people disliking this in Mobile Safari, and it's an option power users will turn off. There's plenty of them already.
Is it? What are the numbers?
So many annoying things are done in the name of "security" without much justification (both online and in real life); Chrome removed the protocol for no reason and Firefox felt obliged to do the same, and now we're removing the whole URL just to help folks who can't be bothered to read it?
Another comment suggests "source code highlighting" for the URL, where the actual domain would be emphasized while still showing the whole URL: what are your thoughts on this?
(Disclaimer: I have not tried the Chrome version in question so I very well may not know what I'm complaining about.)
Providing more detailed background and better numbers on phishing sounds like a good idea. However, this is an experiment that is not on track to ship in any version of Chrome, so I wouldn't consider it a gating criterion for continuing to experiment.
> Chrome removed the protocol for no reason and Firefox felt obliged to do the same, and now we're removing the whole URL just to help folks who can't be bothered to read it?
The decision to not display the HTTP scheme was not related to security (and it wasn't to my personal liking). However, it was done in such a way that it was effectively security neutral.
> Another comment suggests "source code highlighting" for the URL, where the actual domain would be emphasized while still showing the whole URL: what are your thoughts on this?
As I commented elsewhere, Chrome has been highlighting the origin since its inception. However, phishing is still a serious problem, so the team that's working on this is investigating clearer indicators like the ones in this experiment.
So, through obscurity you will achieve security.
Edit: or you expect users to notice phishing attempts more clearly by only displaying the domain name?
It's all just a GUI change; nothing is really obscured (as in inaccessible). It's just more radically highlighted.
The intent isn't to hide the URL from technical people like scammers... it's to hide it from nontechnical people. It's easier to train your grandparents to look at a shortened URL containing just the domain name and verify that it's what they expect, since all the distracting bits of the URL are gone...
(Also - "crackers" refers to a very different group of people....)
Dear John Doe,
An automatic payment has been made from your checking account:
04-May-2014 $125.00 to Big Energy Utility Corp
For more details, please log on to your online banking account and click the "Recent Transactions" tab.
There is no need for hyperlinks in any of that.
To verify your email, please login at SiteYouHaveJustRegisteredAt
and enter the following information:
User ID: 1234
Verification code: 12345678
This is already the case in Firefox. I don't know how long they've been doing it, but I think it's the perfect compromise.
Except that's exactly what's being done here. You mean that no one is twirling their moustache saying "muahahaha, soon we will make search the only way to navigate the web!" like a Bond villain. Of course not; that's stupid. That's not how things happen.
People are concerned that the intention is to make changes to chrome that value the primacy and visibility of URLs fairly low compared to other factors. Everyone involved is well meaning (especially with the "won't somebody please think of the elderly!"-ness of the anti-phishing agenda), they just have an unavoidable institutional bias. It's not a coincidence that these all make the google search bar more prominent. That's valued very highly.
If your criteria value X relatively lowly and you evaluate a bunch of things which are judged with those criteria you end up diminishing X. No one cares that you didn't "intend" to do it, they care that you are doing it.
Also, while reading Jake's post again, I realized that having the URL path colored in grey was already a big hint that something was off. Going all the way to white doesn't make a valuable difference; maybe using blue rather than grey would make the difference more striking.
I'd love to know exactly how the experimentation works. Do you perform user surveys? How do they work? What is the criterion that decides that a particular experiment has failed and should be scrapped?
Chrome is already breaking half of the copy/pastes I do, because the URL came from my history rather than a page that's already loaded.
Do you have any idea how many people actually use copy/paste? I would venture that Ctrl-C/Ctrl-V is probably the only key binding that the majority of computer users, from all walks of life, use.
Half of the places I paste links into don't pick them up without the http://, including things like Markdown in GitHub issues and many chat systems.
Messing with the URL bar the last time has been a constant source of daily irritation to me. So no, I don't want you to mess with it any further.
Just please stop.
I'd prefer to reference the destination URLs and websites in the documentation and related materials I was creating, and the only way to do that with various Google web properties was to visit the destination site and cut and paste from the browser there — otherwise I'd end up with a Google "link shortener" obfuscating the domain, and possibly also eventually interrupting the connection if (when?) Google decided to discontinue or update the redirector service.
Further, this Google practice filled the browser history with intermediate URLs, which rendered the browser history far less useful.
I would not expect Google to stop these behaviors, though. Switch search engines. Vote with your clicks and with the data you (don't) expose to Google.
The F6 functionality is the reason I welcome this, Chromium devs aren't really taking control from us more tech savvy users, they're making it easier for regular users to spot the relevant part of a URL.
I already use F6 for all interaction with the omnibar as it is. Also, I'm not sure which browser it was (Opera or Firefox), but I distinctly recall one of them having the exact same functionality discussed here a few years ago, where only the domain was shown unless the URL field was active.
URLs are the bread and butter of the web. Surely you don't have to hide the whole URL? Why not show the whole URL but visually emphasize the domain in some way so it stands out. Make it easy to read the whole URL while emphasizing the domain (i.e. don't fade the rest of the URL out so much that it's too faint to read).
There are other ways of tackling phishing too. If most phishing occurs when you click links in your email, then email providers could display an intermediary page before you're taken to the destination link. The intermediary page tells you the domain you're being taken to: you click a PayPal link and the intermediary page states you are about to be taken to sharkventures. This could of course be very annoying for every web link, and some users won't read email redirect messages, but it's another approach.
I appreciate trying to make Chrome more secure, but please don't forget about the developers. Annoy them too much, and they might move their development to another platform.
Firefox and IE have started copying Chrome in appearance and philosophy. Opera is now based on the same code. Safari is based on similar code, and is unusable by anyone not using a Mac of some sort.
Firefox is perhaps the most viable alternative to Chrome for web developers or advanced users. But due to that lack of competition or original thought, it really isn't much different from Chrome at all. This has become very apparent with the release of Firefox 29, where they both now look nearly identical, and both suffer from the same dumbing-down that has harmed the experience for more advanced web users.
I just tried to reproduce it, and it doesn't seem to actually happen for every new window I open, so I am not sure what I am doing to trigger it, but I do encounter it several times a day.
edit: Incognito seems to do it, which makes sense because I often enter/leave incognito to make sure I have a fresh cache & no cookies when testing.
Right now you (and other browsers, too) give the "safe" green color to the bare minimum of HTTPS security, even if the site uses an outdated TLS version and 1024-bit RSA. Sites are not going to upgrade their security policies as long as you, the browser vendors, keep showing them to users as "perfectly safe". There's no incentive in it. I think it's on browser vendors to provide some of the incentives.
The URL is intact, and the root domain stands out clearly.
Firefox's implementation of this feature is at least partially because of its usage in Chrome and other browsers: https://wiki.mozilla.org/Firefox/Features/Locationbar_Domain...
The main point is that there are ways of going about this without having to kill the URL bar.
Getting rid of the URL makes users effectively out of touch with 'where' they are on the web and within a website.
A browser is a navigation device, a browser without a sense of location is the digital equivalent of being lost.
My point in the last thread is that, for me personally, I edit the URL a lot. I'm writing HTML5 games, and during dev I use the URL to set parameters, as in `http://mygame.com/?numPlayers=3&ai=true&runSpeed=47`. Having the URL hidden will make that harder to edit. Sure, I can click the "show me the URL" button first; it will just be annoying, because I can't directly target the `47` like I can now, since I won't be able to see it until after I click "show me the URL".
So, just like only devs use dev tools I'd prefer if there was a way to let me see the URL constantly. Like maybe if devtools is open the URL shows? Or maybe it's another options in dev tools just like 'disable cache if dev tools is open'? I'm sure people working with history.pushState would also like to be able to see the URL live.
Do what's best for users in general. Just please help us devs to dev too :)
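For what it's worth, the kind of parameter edit described above can also be scripted. A sketch using Python's standard library (the URL and parameter names are just the example from the comment, with the query string introduced by a `?`):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def set_param(url, key, value):
    """Return url with one query parameter replaced (or added)."""
    parts = urlsplit(url)
    params = dict(parse_qsl(parts.query))  # preserves parameter order
    params[key] = value
    return urlunsplit(parts._replace(query=urlencode(params)))
```

This sidesteps the omnibox entirely, though it obviously doesn't replace the convenience of clicking straight into the visible URL.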
This is similar to what happens when the path is too long for the omnibox, but is simply the effect of aligning the domain as far left in the omnibox as possible.
i.e. instead of seeing:
[ www.mybank.com.credicards.wt3_segment_secure.login.html.evil-site.com ]
[ ...html.evil-site.com/ ] (where "...html." are semi-opaque).
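That left-anchored elision can be sketched as follows; the width limit and ellipsis style here are arbitrary illustrative choices, not Chrome's actual values:

```python
def elide_host_left(host, max_chars=20):
    """Keep the rightmost part of a long hostname, since the
    registrable domain sits at the right-hand end."""
    if len(host) <= max_chars:
        return host
    return "..." + host[-(max_chars - 3):]
```

Applied to the mock-up above, the deceptive `www.mybank.com...` prefix is the part that gets elided, while `evil-site.com` stays visible.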
I mocked up what I'm suggesting here: http://remysharp.com/2014/05/04/on-chrome-hiding-urls-to-pro...
In particular, I generally like the change, but it bothers me that I need to hit the new button in order to see the URL, instead of clicking anywhere in the omnibox. I'd rather see a better-phrased version of "Search Google or interact with URL" in the omnibox, and get the whole URL on click anywhere in there. I could imagine then making a click on the button pull up the security details, just like what happens when you click the lock currently.
Also, it'd be nice if you showed the whole URL when hovering over the button.
I agree that this would be a big win.
If this type of rendering were in place, it would protect the user in most cases even when a secondary issue like the one I reported exists in the full rendering code. Well worth the minor inconvenience of having to click the "chip" to show the full URL if you happen to need to see or copy it.
I know one thing it would prevent, though: if Chrome keeps doing things like this, it may actually force me to switch back to Firefox. Hiding URLs is, IMO, a horrible idea.
I've put together a quick repo as a proof of concept of an idea that maps a chain to an HTML color, but I have no idea where to incorporate it into a UI.
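One way to sketch that mapping, assuming "chain" means the site's certificate chain hashed down to a stable color (this is my guess at the idea, not the actual repo's code):

```python
import hashlib

def chain_to_color(chain_der_blobs):
    """Hash a certificate chain to a stable HTML color.

    Any change anywhere in the chain yields a different color, which
    is the property a visual fingerprint needs: the user learns "my
    bank is teal" and notices when the color suddenly differs.
    """
    h = hashlib.sha256()
    for blob in chain_der_blobs:
        h.update(blob)
    return "#" + h.hexdigest()[:6]
```

Six hex digits gives ~16.7 million colors, so an attacker can't easily force a collision with the victim site's color, though many distinct colors are too similar for humans to tell apart.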
Here are some things that actively mitigate phishing; many of them available in Chrome and actively used by many web properties (including Google's).
HttpOnly cookies (introduced in 2002!)
Content Security Policy
isolating user input to low-privilege domains to protect unsecured user information
clear, identifiable URLs that increase the odds of users recognizing something wrong
What is an example of a phishing attack or XSS attack that would be stopped by this change? Is there at least an example of an attack that would be mitigated? I cannot for the life of me think of one.
Phishing is when an evil website tricks you into typing your bank password into it. HttpOnly cookies (as an example) are not going to do anything to prevent an evil website from looking like your bank's website.