As a member of the Chrome security team and one of the original instigators for this experiment, yes the whole point is to prevent phishing. The fact is that phishing is one of the most common attack vectors for most people, and the way the URL is currently displayed does very little to protect them. So, we're experimenting with ways of displaying the essential information (origin and TLS state) as clearly as possible, while removing the components that are not security relevant and are currently being abused to trick users.
No one has any intention of diminishing usability or making it hard to manipulate URLs. The team working on this is still actively refining things and studying what works and what doesn't. But, phishing is a very big problem, and this change to the omnibox shows real promise in countering the attacks. So, I think we would be remiss in not pursuing the investigation further.
So phishers buy domains with a Levenshtein distance of 1 or 2. It solves one problem, but creates an entire class of users that don't understand what a URL is. Who benefits? Google and search engine providers, because now they can manipulate future internet users to believe that search engines are the internet. We've reverted to AOL in 1995.
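For anyone unfamiliar with the term: Levenshtein distance counts the single-character edits between two strings, and a minimal sketch shows just how close a lookalike domain can be (the lookalike names here are invented for illustration):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# A lookalike domain is often a single edit away from the real one:
print(levenshtein("halifax.co.uk", "hallfax.co.uk"))  # 1
print(levenshtein("paypal.com", "paypa1.com"))        # 1
```

A distance of 1 or 2 is well below what most people notice at a glance, which is the whole basis of the attack being discussed.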
There is nothing more that can be productively argued about this topic. There will be analogies about how complexity is hidden in various domains (cars, computers) and how beneficial it has been and how users are happy with it. Those arguments are fine and maybe they are being made in good faith, but it doesn't change the underlying future truth:
Marketing will now be changed to reflect Google keywords, not URLs. "www." and ".com" will become meaningless. Google will have put one more level of distance between what the users type in the URL and even what they click in the browser and what is reflected in the address bar.
I agree that this experiment isn't demonstrating a perfect mitigation, but it's important to appreciate that it's currently vastly easier for a phisher to permute paths and subdomain components than it is to create a convincing ETLD+1. There are various reasons for this, including less text for a phisher to work with and registration requirements for ETLD+1 domains (which means they can't be iterated and dumped as quickly, and domain owners may have a more immediate legal recourse against phishers). That's the point of experimenting on features like this, to get an idea of whether or not they would be beneficial.
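To make the ETLD+1 point concrete, here is a toy sketch of how a browser might extract the registrable domain. Real implementations consult the full Public Suffix List; the tiny hard-coded suffix set here is purely illustrative:

```python
# Toy eTLD+1 extraction. Real browsers use the complete Public Suffix
# List (publicsuffix.org); this hard-coded set exists only for the demo.
KNOWN_SUFFIXES = {"co.uk", "com", "org", "net"}

def etld_plus_one(host: str) -> str:
    """Return the registrable domain (public suffix plus one label)."""
    labels = host.lower().split(".")
    # Walk left to right, looking for the longest known public suffix.
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in KNOWN_SUFFIXES and i > 0:
            return ".".join(labels[i - 1:])
    return host

# A phisher can mint endless deceptive subdomains for free,
# but the registrable domain stays put:
print(etld_plus_one("halifax.co.uk.evil.example.com"))  # example.com
print(etld_plus_one("www.halifax.co.uk"))               # halifax.co.uk
```

Note how the long deceptive hostname collapses to its registrable domain, which is exactly the part a phisher cannot permute without registering a new name.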
The issue isn't users recognizing the path, it's the domain. It's also that they aren't taking special care while logging in.
Additionally, what about addressing insecure forms that fail to use HTTPS? Chrome already detects login forms. So just warn users by turning the origin chip to a red background when they are on a login page.
On the whole, supporting secure logins would be better for the internet.
The team may choose to do something like that in the end. That's really the point of experimenting with different approaches; they use them to get feedback, run user studies, and get a sense of what works best.
> Additionally, what about addressing insecure forms that fail to use HTTPS? Chrome already detects login forms. So just warn users by turning the origin chip to a red background when they are on a login page.
That's actually very, very hard (as in NP-hard). Chrome has heuristics for detecting login pages, but it doesn't even detect all legitimate login pages. And it's trivial for a phisher to intentionally make a page that appears exactly like a login page to the user, but will not be detected by Chrome's heuristics.
> That's actually very, very hard (as in NP-hard). Chrome has heuristics for detecting login pages, but it doesn't even detect all legitimate login pages. And it's trivial for a phisher to intentionally make a page that appears exactly like a login page to the user, but will not be detected by Chrome's heuristics.
Well, it wouldn't have to detect all login pages, it could just detect most of them. That would add soft pressure to encourage regular websites to use HTTPS, in the vein of:
Hopefully we can push the web towards https everywhere, and users begin to ask the question -- "Why is this page not secure" when performing a login.
This soft pressure worked wonders in a number of places: When Google started doing sitelinks, many websites became much more concerned about making those available on their website. In a similar way, hopefully they would be concerned with getting out of the red for their login pages.
But `input[type="password"]` would cover the majority of login forms. At the least, it would force the con artist to mimic a native password box, which is more likely to get caught by the end user.
You could detect forms that have more than one user provided input field as they are submitted. This wouldn't detect most search boxes or forms that use hidden inputs to transmit values to the backend.
> Additionally, what about addressing insecure forms that fail to utilize https.
FWIW, Firefox detects insecure login forms and emits a security warning to the web console. This is aimed at developers, however, not users (because the developers are the only ones who can improve the situation).
Our heuristic is imperfect as well. We simply detect <input type="password"> fields on http pages. This works well enough. Trying to detect when developers are abusing <input type="text"> for a password field is non-trivial.
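The simple heuristic described above (flag `<input type="password">` served over plain http) can be sketched in a few lines. This is an illustrative reimplementation, not Firefox's or Chrome's actual code:

```python
from html.parser import HTMLParser

class PasswordFieldFinder(HTMLParser):
    """Records whether the page contains an <input type="password">."""
    def __init__(self):
        super().__init__()
        self.found = False

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; value may be None.
        if tag == "input" and (dict(attrs).get("type") or "").lower() == "password":
            self.found = True

def looks_like_login_page(html: str, scheme: str) -> bool:
    """Warn only when a password field is served over plain http."""
    finder = PasswordFieldFinder()
    finder.feed(html)
    return finder.found and scheme == "http"

print(looks_like_login_page('<form><input type="password"></form>', "http"))   # True
print(looks_like_login_page('<form><input type="text"></form>', "http"))       # False
```

As the comment says, a site abusing `<input type="text">` for a password field sails right past this check, which is why it is only a developer-facing warning.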
How long has Firefox been logging security warnings (in the web console) for insecure login forms? Do you know of telemetry about how prevalent the problem is or whether the security warnings have helped? I assume, for compatibility, Firefox will never be able to simply reject insecure login forms.
I really like that. Preserves all the benefits of hiding the path entirely without... hiding the path entirely. However, how would the browser tell the difference between editing the path and entering an entirely new url or search? It wouldn't be intuitive if you have to click the domain chip to go to a different site. Most of the time a user is going to click the path area in that screenshot, since that's what we're used to. But then, how do you edit just the path?
I'd say that is the best solution, possibly with more work done on the certificate button to the left of the URL. The button currently highlights "good" sites: HTTPS with an Extended Validation certificate. It should be the other way around: highlight as a warning when the site falls short, with no SSL or SSL without EV.
I don't know if such a thing already exists, but perhaps a way for a user to define 'secure urls' that go to their bank (or other important site) would be useful. Chrome could then warn when a user goes to a site that looks too similar.
There is more that can be productively argued about this topic, at least for parties who decide not to insist otherwise. URLs won't go away as long as people are still sharing websites on social networks or their own websites. I don't see the problem with not displaying the entire URL at the top of the browser window, if no actual functionality is lost. In this case, there aren't even any additional clicks required to perform the same operations with URLs.
Came here to say similar. In addition, I don't see any problem. The URL bar is a redundant usability smell. It needs to be cleaned up in the fashion they have done here.
On security, I feel it's a problem, sure, but teaching users about URLs is a band-aid fix, a hack, and completely irrelevant to this change. You can't fix the phishing problem by showing URLs. It needs to be tackled in the proper manner and solved silently, away from the user.
Google already does the right thing when you use them for DNS (redirects on misspelt domains). They also implemented blacklists of dodgy content; not ideal and it doesn't scale, but it's a good start. Better than claiming the user is at fault.
Basically if you need to claim the user is at fault your design is wrong whether you like it or not.
I agree with the possibility for productive debate, my comment is probably a bit too cynical. I just see a lot of redundant arguments, bad analogies and strawmen coming up soon and jumped the gun.
Well, if www and .com become meaningless then advertisers will use "type xxx yyy into your address bar" instead (which goes to the search engine) or hashtags (they are already doing this). Then Google could come up with "associate permanent keywords with your URL" (for a small fee of course) to guarantee that those keywords won't shift under you when your Google ranking changes.
Those keywords are not supposed to live long: they are used as part of marketing campaigns. But your comment is still valid. Such a strategy would be very risky in the long term.
Agreed. It also solves another problem in Japan, namely that brand names etc are usually written in Japanese letters, but URLs in ASCII (yes, I'm aware of IDN, but haven't ever seen it used). This is obviously not limited to Japan, I wouldn't be surprised if other countries used the same trick.
Great link... Seeing people communicate "google yourbank" and address bar becoming a search box, I thought href could be extended as an innovative technology to allow intermediated linking through a search engine.. (or Url shorteners with keywords)... But AOL already had it...
No, it wouldn't. You might put that in an ad (and as people above note, this is already happening), but this only changes what's displayed in the location bar. It would be stupid to link to another site through a web search when you can just link to the site.
Although, I'll note that once again, the internet is already on the case. The link you were looking for was http://lmgtfy.com/?q=keyword
I personally have never understood the huge sums of money paid by some people for some URLs. I can't remember URLs; I can't remember keywords. What I can remember is some fragment of a page title, and I hope that's enough to find it in my bookmarks or in a websearch.
I wish Google would just release some numbers about the numbers of people who get this stuff wrong, because learning how many people use (correctly) the + operator made that change much easier to cope with.
>There is nothing more that can be productively argued about this topic.
Is that ever a good thing to say?
The change is good. I never saw any mention of people disliking this in Mobile Safari, and it's an option power users will turn off. There's plenty of them already.
So many annoying things are done in the name of "security" without much justification (both online and in real life); Chrome removed the protocol for no reason and Firefox felt obliged to do the same, and now we're removing the whole url just to help folks who can't be bothered to read it?
Another comment suggests "source code highlighting" for the url, where the actual domain would be emphasized while still showing the whole url: what are your thoughts on this?
(Disclaimer: I have not tried the Chrome version in question so I very well may not know what I'm complaining about.)
Providing more detailed background and better numbers on phishing sounds like a good idea. However, this is an experiment that is not on track to ship in any version of Chrome, so I wouldn't consider it a gating criterion for continuing to experiment.
> Chrome removed the protocol for no reason and Firefox felt obliged to do the same, and now we're removing the whole url just to help folks who can't be bothered to read it?
The decision to not display the HTTP scheme was not related to security (and it wasn't to my personal liking). However, it was done in such a way that it was effectively security neutral.
> Another comment suggests "source code highlighting" for the url, where the actual domain would be emphasized while still showing the whole url: what are your thoughts on this?
As I commented elsewhere, Chrome has been highlighting the origin since its inception. However, phishing is still a serious problem, so the team that's working on this is investigating clearer indicators like the ones in this experiment.
If black vs. light grey is your concept of "highlighting" then I have to tell you that I didn't even notice the behavior until today. Maybe you should try actual color before removing functionality?
It's the latter. By only displaying the domain name users are more likely to notice a scam domain name, e.g. the Halifax.co.uk case in the parent blog post.
The intent isn't to hide the URL from technical people like scammers... It's to hide it from nontechnical people. It's easier to train your grandparents to look at a shortened URL containing just the domain name, and have them verify that it is what they expect, as all the distracting bits of the URL are gone...
(Also - "crackers" refers to a very different group of people....)
I feel I'm fairly immune to traditional phishing because I never click a URL in an email. If namecheap sends me an email saying one of my domain names is about to expire, I don't use their "Renew Now" link; I go to "namecheap.com" (not hard to type), log in, and renew the name. Banks and other organizations that send long complicated links in emails and encourage people to click them are part of the reason phishing is possible. These emails should look like:
Dear John Doe,
An automatic payment has been made from your checking account:
04-May-2014 $125.00 to Big Energy Utility Corp
For more details, please log on to your online banking account and click the "Recent Transactions" tab.
Yeah, part of me wonders how bad it would be / what would break if email clients banned outside links (e.g. beyond fragments in the email). My suspicion is that the only useful use of a link is a confirmation email which have other potential implementations...
Plus, it forces web sites to be designed in such a way that it's easy to locate what you need, e.g. "recent transactions" shouldn't be buried deep in the navigation tree.
"No one has any intention of diminishing usability or making it hard to manipulate URLs"
Except that's exactly what's being done here. You mean that no one is twirling their moustache saying "muahahaha, soon we will make search the only way to navigate the web!" like a bond villain. Of course not, that's stupid. That's not how things happen.
People are concerned that the intention is to make changes to chrome that value the primacy and visibility of URLs fairly low compared to other factors. Everyone involved is well meaning (especially with the "won't somebody please think of the elderly!"-ness of the anti-phishing agenda), they just have an unavoidable institutional bias. It's not a coincidence that these all make the google search bar more prominent. That's valued very highly.
If your criteria value X relatively lowly and you evaluate a bunch of things which are judged with those criteria you end up diminishing X. No one cares that you didn't "intend" to do it, they care that you are doing it.
My first reaction when I saw this change was, 'Oh, good, this website uses HTTPS now'. If I wasn't tech savvy, I would probably not have noticed that I was wrong. Given that, when I learned what had been done, the first thought that popped in my head was, 'Phishing only has shifted from one form of deceit to the next'.
Also, while reading Jake's post again, I realized that having the URL path colored in grey was already a big hint that something was off. Going all the way to white doesn't make a valuable difference; maybe using blue rather than grey would make the difference more striking.
I'd love to know exactly how the experimentation works. Do you perform user surveys? How do they work? What is the criterion that decides that any particular experiment failed and should be scrapped?
Chrome is already breaking half of the copy/pastes I do because it's from my history and not a page that's already loaded.
Do you have any idea how many people actually use copy/paste? I would venture that Ctrl-C/Ctrl-V is probably the only key binding that the majority of computer users of all walks of life use.
Half of the places I seem to be pasting links into don't pick up the links without the http. Including things like markdown in GitHub issues, and many chat systems.
The last time you messed with the URL bar it became a constant daily source of irritation to me. So no, I don't want you to mess with it any further.
Something similar happened with various other of Google's properties a while back — Google's use of intermediate redirector URLs that broke cut-and-paste.
I'd prefer to reference the destination URLs and websites in the documentation and related materials I was creating, and the only way to do that with various Google web properties was to visit the destination site and cut and paste from the browser from there — otherwise I'd end up with a Google "link shortener" obfuscating the domain, and possibly also eventually interrupting the connection if (when?) Google decided to discontinue or update the redirector service.
Further, this Google practice filled the browser history with intermediate URLs, which rendered the browser history far less useful.
I would not expect Google to stop these behaviors, though. Switch search engines. Vote with your clicks and with the data you (don't) expose to Google.
Why should it make copying, pasting, or editing URLs more difficult? Right now in stable Chrome, you click once on the URL bar and it selects the entire URL. Then you can Ctrl-C to copy, Ctrl-V to paste, or click again to put the cursor in a specific place to edit it. That shouldn't be any different with this new Chrome feature they're testing.
The F6 functionality is the reason I welcome this, Chromium devs aren't really taking control from us more tech savvy users, they're making it easier for regular users to spot the relevant part of a URL.
I already use F6 for all interaction with the omnibar as it is. Also, I'm not sure which browser it was (Opera or Firefox), but I distinctly recall either of those a few years ago having the exact same functionality discussed here, where only the domain was shown unless the URL field was active.
URLs are the bread-and-butter of the web. Surely, you don't have to hide the whole URL? Why not simply show the whole URL but visually emphasise the domain in some way so it stands out. Make it easy to read the whole URL while emphasizing the domain (i.e. don't fade out the rest of the URL so it's too faint to read).
There are other ways of tackling phishing too. If most phishing occurs when you click links in your email, then email providers could display an intermediary page before you're taken to the destination link. The intermediary page tells you the domain you're being taken to: you click a PayPal link and the intermediary page states you are about to be taken to sharkventures. This could of course be very annoying for every web link, and some users won't read email redirect messages, but it's another approach.
That's what I'm thinking too. Why not make the domain rainbow colored and blinking if that's what it takes to stand out. It doesn't mean the rest of the url has to be absent entirely.
As a developer, if this is going to hide any useful information on first glance I am not sure how I feel about that. I already feel like Chrome has started shunning developers with that over the top annoying pop-up any time I open a new window (Ctrl+N, type, stop typing because I have to move my mouse to close the popup), and moving towards forcing developers to distribute their extensions through the play store (which kills any small-time extensions for tiny communities & friends).
I appreciate trying to make Chrome more secure, but please don't forget about the developers. Annoy them too much, and they might move their development to another platform.
A big part of the problem is that while there are competing browsers, we just don't see any real competition these days.
Firefox and IE have started copying Chrome in appearance and philosophy. Opera is now based on the same code. Safari is based on similar code, and is unusable by anyone not using a Mac of some sort.
Firefox is perhaps the most viable alternative to Chrome for web developers or advanced users. But due to that lack of competition or original thought, it really isn't much different from Chrome at all. This has become very apparent with the release of Firefox 29, where they both now look nearly identical, and both suffer from the same dumbing-down that has harmed the experience for more advanced web users.
What is the popup you're encountering? Because it sounds like a bug, and I'd like to make sure someone is working on fixing it (or has already done so).
I just tried to reproduce it, and it doesn't seem to actually happen for every new window I open, so I am not sure what I am doing to trigger it, but I do encounter it several times a day.
edit: Incognito seems to do it, which makes sense because I often enter/leave incognito to make sure I have a fresh cache & no cookies when testing.
The custom extension popup. And you don't want to fix it (status is WontFix), where a simple checkbox in the settings could have fixed it... You made our lives miserable.
You want to help people feel more secure? How about you downgrade every HTTPS website to a yellow color, unless it uses PFS (only with a few good cipher suites) and the latest version of TLS. Requiring HSTS would probably be good, too. They should also be required to have 100 percent of the site under HTTPS, with no mixed content, before getting the green color.
Right now you (and other browsers, too) give the "safe" green color to the bare minimum of HTTPS security, even if it uses an ancient TLS version and 1024-bit RSA. Sites are not going to upgrade their security policies as long as you, the browser vendors, keep showing them to the users as "perfectly safe". There's no incentive in it. I think it's on browser vendors to push some of the incentives.
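One rough way to approximate the PFS check being proposed: classify cipher suites by their key-exchange component. In OpenSSL-style suite names, an ECDHE/DHE prefix means ephemeral key exchange (forward secrecy), while names with no key-exchange prefix use static RSA. A sketch, assuming OpenSSL naming conventions:

```python
def has_forward_secrecy(cipher_suite: str) -> bool:
    """Crude PFS check based on the OpenSSL-style suite name.

    ECDHE/DHE (and the older EDH alias) use ephemeral Diffie-Hellman,
    so a recorded session can't be decrypted later even if the server's
    long-term key leaks. Suites with no such prefix use static RSA
    key exchange, which offers no forward secrecy.
    """
    name = cipher_suite.upper()
    return name.startswith(("ECDHE-", "DHE-", "EDH-"))

print(has_forward_secrecy("ECDHE-RSA-AES128-GCM-SHA256"))  # True
print(has_forward_secrecy("AES256-SHA"))                   # False (static RSA)
```

A real grading scheme would of course also inspect the TLS version, key sizes, and HSTS, as the comment suggests; this only covers the key-exchange half of the argument.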
What percentage of big websites (say, top 100 sites and top 100 e-commerce sites) use HTTPS without PFS? If it's significant, then the yellow will just become meaningless.
It's a start but I hadn't noticed this feature and wouldn't notice if the whole URL were in black. Using different colours to highlight different parts of the URL would be a big improvement.
It's bad enough that Google makes it impossible to cut-and-paste links from search results (try sending the URL of a PDF you've found using Google to a colleague, especially nice if it is longer than the maximum bit displayed as text).
Getting rid of the URL makes users effectively out of touch with 'where' they are on the web and within a website.
A browser is a navigation device, a browser without a sense of location is the digital equivalent of being lost.
Does the kind of person who falls victim to a phishing attack pay attention to anything on the URL bar anyway? Are they going to have any idea what the green padlock thing or "secure connection" even mean in the context?
As usual I put my foot in my mouth on the last HN thread. This sounds like a good idea.
My point in the last thread is that for me, personally, I edit the URL a lot. I'm writing HTML5 games. During dev I use the URL to edit/set parameters as in `http://mygame.com/?numPlayers=3&ai=true&runSpeed=47`. Having the URL hidden will make it harder to edit that. Sure I can click the "show me the URL button first" it will just be annoying because I can't directly target the `47` like I can now since I won't be able to see it until after I click "show me the URL".
So, just like only devs use dev tools I'd prefer if there was a way to let me see the URL constantly. Like maybe if devtools is open the URL shows? Or maybe it's another options in dev tools just like 'disable cache if dev tools is open'? I'm sure people working with history.pushState would also like to be able to see the URL live.
Do what's best for users in general. Just please help us devs to dev too :)
Why not, instead, show an ellipsis in place of the majority of the subdomain leading up to the domain and increase opacity of the domain compared to subdomains and the pathname.
This is similar to what happens when the path is too long for the omnibox, but simply the effect of putting the domain as far left (in the omnibox) as possible.
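The subdomain-ellipsis idea above can be sketched directly. The `keep_labels=2` assumption stands in for a real Public Suffix List lookup, so this is only illustrative:

```python
def elide_subdomains(host: str, keep_labels: int = 2) -> str:
    """Replace leading subdomain labels with an ellipsis.

    keep_labels=2 assumes a simple TLD like .com; a real browser would
    consult the Public Suffix List to decide how many labels form the
    registrable domain.
    """
    labels = host.split(".")
    if len(labels) > keep_labels:
        return "…." + ".".join(labels[-keep_labels:])
    return host

# The deceptive left-hand labels vanish; the registrable domain remains:
print(elide_subdomains("paypal.com.evil.attacker.example"))  # ….attacker.example
print(elide_subdomains("example.com"))                       # example.com
```

Combined with higher opacity on the remaining domain, this keeps the full URL's shape visible while still directing the eye to the part that matters.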
I would prefer if the hostname was clearly separated and highlighted with the path after. I don't really feel the need to be able to instantly enter a new url outweighs my desire to see what URL I currently am on.
@justinschuh: what's the best way to provide feedback?
In particular, I generally like the change, but it bothers me that I need to hit the new button in order to see the URL, instead of clicking anywhere in the omnibox. I'd rather see a better-phrased version of "Search Google or interact with URL" in the omnibox, and get the whole URL on click anywhere in there. I could imagine then making a click on the button pull up the security details, just like what happens when you click the lock currently.
Also, it'd be nice if you showed the whole URL when hovering over the button.
If this type of rendering were in place it would protect the user in most cases even when a secondary issue like the one I reported exists in the full rendering code. Well worth the minor inconvenience of having to click the 'chip' to show the full url if you happen to need to see/copy it.
I'm not sure how this would prevent phishing. 99% of people affected by phishing don't know much about URLs anyway, and legit sites routinely use weird domains like "onlinecredit6.com" or "xyznetaccess.com" which are indistinguishable from phishing ones.
I know one thing though it would prevent - if Chrome keeps doing such things that actually may force me to switch back to Firefox. Hiding URLs IMO is a horrible idea.
Calling this a phishing mitigation is blatantly dishonest. Hiding the URL does nothing to stop phishing. UX improvements are great, and this is clearly a UX change designed to perform great on metrics Google cares about (like search traffic), but it is not anything resembling an attack mitigation.
Here are some things that actively mitigate phishing; many of them available in Chrome and actively used by many web properties (including Google's).
HttpOnly cookies (introduced in 2002!)
'secure' cookies
Content Security Policy
iframe sandboxing
input sanitization
isolating user input to low-privilege domains to protect unsecured user information
clear, identifiable URLs that increase the odds of users recognizing something wrong
two-factor authentication
What is an example of a phishing attack or XSS attack that would be stopped by this change? Is there at least an example of an attack that would be mitigated? I cannot for the life of me think of one.
Not one of the things you list helps against phishing. I think maybe you are confusing phishing with other types of attacks.
Phishing is when an evil website tricks you into typing your bank password into it. HttpOnly cookies (as an example) are not going to do anything to prevent an evil website from looking like your bank's website.
So this came from the same genius who came up with the "checkbox to show all your passwords in clear text without any further safeguards" feature? Man, you're a menace to the web.