
Google Takes Its First Steps Toward Killing the URL - glassworm
https://www.wired.com/story/google-chrome-kill-url-first-steps/
======
move-on-by
I’m against Chrome taking away the URL. While they make points about
security and dumb users, I don’t believe they have altruistic motives.

An issue with AMP is the terrible URLs, because the pages are hosted by Google.
Once Chrome no longer has URLs, then Chrome gets to decide what name to show
you in the address bar - potentially a name that has very little to do with
the actual location of the document. Maybe I’m being a bit extreme or
pessimistic, but I don’t believe this is a good change for the web, and I don’t
think Google can be trusted to be the steward of the internet.

~~~
ken
They don't say anything about "dumb users", and I wish the conversation around
this didn't even use that phrase. Hardhats aren't just for clumsy people,
seatbelts aren't just for bad drivers, and memory protection isn't just for
careless programmers. There's no shame in engineering a situation to maximize
safety for the people who will use it.

There's a whole lot of smart people who like static typing for the safety
benefits it brings to their programming. Web browsers are the front lines:
where one click on a stringly-typed reference can cause unknown third-party
code to run on your computer. You are protected only by your human ability to
spot the visual difference between two strings, one of which is constructed by
someone trying their hardest to trick you. If you advocate for type safety
within software, you should also advocate for a better system than URIs.

~~~
SllX
Or advocate for disabling the ability for your web browser to run 3rd party
code just by clicking a link.

Google is not trustworthy because between Chrome and Search they have too much
of a stake in the eventual outcome of anything that could replace URLs. Any
system that eventually does replace URIs should be able to tell me at a
glance: is it http, ftp or a local file? Is there a TLS cert? What is the
domain of the server I am accessing? Approximately where in the site directory
am I?

I simply don’t trust browsers or sites which try to obscure what is actually
useful information because some UX guru told them it was unnecessary. If
anything, I want URLs with even more information. Which version of “http” is
in use would be a nice start at this point.

~~~
mrhappyunhappy
Feels like every point you mention can be designed in a user-friendly way. I’m
a UX designer and have some ideas for this.

~~~
TeMPOraL
Please write them down and publish them somewhere; we need more proposals that
empower users out there to see and discuss.

------
zestyping
If I'm understanding correctly, it sounds like the plan is to use heuristics
and machine learning to guess which URLs look "tricky".

I'm highly skeptical of an approach that involves training users to rely on a
black-box ML system. That just makes them ever more dependent on technology
they can't possibly understand and puts more power in Google's hands. By being
the sole arbiter of what is "tricky," Google gets to blacklist the entire
Internet.

It would be better to help users understand the URL. I don't mean expecting
users to parse the syntax on sight; I mean finding ways to display or
represent it so that the important information is easier to see and fraud is
easier to spot.

~~~
aaronmdjones
Along the lines of your last thought (helping users understand the URI by
displaying it in an idiot-proof manner), this wouldn't be hard at all if they
simply had 4 separate areas for the protocol, the hostname (if any), the
domain & public suffix combined, and the path. e.g.:

    
    
        https://www.example.net/foo.html
    

becomes:

    
    
        [https] [www] [example.net] [foo.html]
    

Then they can colour the public suffix e.g. black and the rest of it light-
grey, much like they do already, BUT it's _also_ clear which box you always
need to look at to determine the site's identity.

It could go even further and obscure the contents of the first, second, and
fourth boxes until you mouseover or focus them (but all of the boxes should
have a light red background for http, and a light green one for EV, even if
you can't see the text in them), and the last one should be kept far from the
one before it, to avoid e.g.:

    
    
        [https] [www.example.net] [example.org] [foo.html]
        [https] [www] [example.org] [www.example.net/foo.html]
    

(It would be easy to accidentally think you were somewhere at example.net with
both of the above, even though you're really somewhere at example.org)

Clicking on any box (or the regular Ctrl+L) could turn it back into one box
(for easy URI copying) and defocusing it will revert it again. Power users
could set a knob to simply always display the 1 bar they've been looking at
for the last 25+ years.

Maybe there could even be a conditional 5th area for the query parameters (GET
variables) which isn't even shown by default (without input area focus), who
knows.

    
    
        [https] [news] [ycombinator.com] [reply] [id=19032043&goto=item%3Fid%3D19031237%2319032043]
    

Just my wild 4am ideas... probably lots of things wrong with it I can't
imagine right now.
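The four-box split above can be sketched mechanically. This is a naive illustration: it assumes a one-label public suffix, so a real implementation would need the Public Suffix List to handle suffixes like co.uk.

```python
from urllib.parse import urlsplit

def four_boxes(url):
    # Split a URL into the four proposed display areas:
    # [scheme] [subdomain] [registrable domain] [path]
    parts = urlsplit(url)
    labels = parts.hostname.split(".")
    registrable = ".".join(labels[-2:])  # naive: assumes a one-label public suffix
    subdomain = ".".join(labels[:-2])    # may be empty, like the "if any" hostname box
    return [parts.scheme, subdomain, registrable, parts.path.lstrip("/")]

print(four_boxes("https://www.example.net/foo.html"))
# → ['https', 'www', 'example.net', 'foo.html']
```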

~~~
Matheus28
I'd personally invert the order of the 2nd and 3rd areas. Yes, it'll look
ugly, but it's way easier for users to parse for phishing:

[https://example.com.phishing.com](https://example.com.phishing.com) ->
[https] [phishing.com] [example.com] [foo.html]

~~~
murbard2
You can go big-endian all the way:

[https] [com] [phishing] [com] [example] [foo.html]
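A sketch of that fully big-endian rendering (the box layout itself is a hypothetical UI, not anything shipping): it is just the scheme, the host labels reversed, then the path.

```python
from urllib.parse import urlsplit

def big_endian_boxes(url):
    # Scheme first, then host labels from the TLD down, then the path,
    # so "example.com.phishing.com" can't masquerade as example.com.
    parts = urlsplit(url)
    return [parts.scheme] + parts.hostname.split(".")[::-1] + [parts.path.lstrip("/")]

print(big_endian_boxes("https://example.com.phishing.com/foo.html"))
# → ['https', 'com', 'phishing', 'com', 'example', 'foo.html']
```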

------
JohnBooty
Great example of why we need diverse browser culture, instead of a Chrome
monoculture.

When we hand Google a browser monopoly, we hand them de facto authority over
_everything_ related to the web.

It will effectively be up to a _single for-profit entity with questionable-at-
best motives_ how we see the web in fundamental ways.

That's not to say the work they're doing here is bad. Might be good. And it's
a long way from production. But that's beside the point.

~~~
99052882514569
Exactly. Having recently switched from Chrome to Firefox, I'm puzzled by
language such as "Google is thinking about killing the URL". Nope, still
featured prominently in my address bar, protocol and all, and Google has zero
say about whether it stays that way.

------
deanclatworthy
I don't see the outrage here. Using heuristics to determine the likelihood of
a URL being fake sounds like a good idea, as long as it's weighted against
false positives.

That said, I've never understood why browsers do not highlight the hostname
separately from the path. Many phishing domains are of the form:
google.com.auth.something.else.realistic.looking.tk/fake-path-stuff and are so
long that the user just sees google.com and moves on. Something as simple as
underlining the hostname or making the path a slightly lighter hue would be a
huge usability improvement in being able to stop phished hosts.
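As a toy rendering of that suggestion (brackets standing in for the underline or lighter hue, since plain text can't carry color):

```python
from urllib.parse import urlsplit

def render_with_host_emphasis(url):
    # Wrap the hostname in brackets as a stand-in for underlining it
    # or dimming the rest of the URL.
    parts = urlsplit(url)
    prefix = parts.scheme + "://"
    rest = url[len(prefix) + len(parts.netloc):]
    return prefix + "[" + parts.netloc + "]" + rest

url = "https://google.com.auth.something.else.realistic.looking.tk/fake-path-stuff"
print(render_with_host_emphasis(url))
# → https://[google.com.auth.something.else.realistic.looking.tk]/fake-path-stuff
```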

~~~
Leace
Firefox does highlight the registered domain: "looking.tk" is shown in a white
font, while the rest of the URL (the
"google.com.auth.something.else.realistic" part and the path) is colored gray.

~~~
Sujan
For me it is black for the domain vs. dark grey for the rest (FF65). I
actually never noticed this before, nice!

~~~
aembleton
I think it's dependent upon your theme. For example, with a dark theme it
looks like so: [https://i.imgur.com/bwsyhjN.png](https://i.imgur.com/bwsyhjN.png)

~~~
Sujan
Oh of course! Much more pronounced there.

------
avanderveen
Changing the UX for URLs in a browser that has overwhelming market share is
going to change how people think about and assess the identity of websites
they visit.

Over time, it could make other browsers feel less familiar, old fashioned, and
maybe even shady for most people.

It may end up improving security for some (we'll see), but it may also improve
the security of Chrome's market share (whether that's the motivation behind
the move or not).

------
ccnafr
The article talks about showing warnings when accessing mistyped URLs, not
"killing URLs."

Nothing in the article I've read suggests they're doing anything of the kind.
What a bunch of clickbait bs.

If you search this HN comments page, you'll find 3-4 other people claiming the
same thing.

This article is an insult to news reporting.

~~~
jesseryoung
Yes, I was very confused by the title as well. The introduction briefly talks
about "changing the way site identity is presented" but after that it talks
about just flagging suspicious URLs. Nothing in there about actually "killing"
them.

------
maz1b
There HAVE to be underlying motives at play here. I agree phishing and similar
attacks pose a threat to the average net surfer, but if Google really cared
about "dumb users", surely they could use ML to bolster and enhance the
warning and safe-browsing system they already have, and use that as an
additional plus point or marketing point. To imply that this is being done
merely for the sake of the "user" is laughable.

This is simply a red herring to (eventually) guide user interface behavior
towards a system or set of systems that moves towards obscure, non-transparent
and centralized control.

------
dreamcompiler
Very misleading headline. The article does not say Google wants to kill the
URL; it says Google wants to take steps to make sure users can more easily see
what domain they're on and make it harder for scammers to spoof legitimate
domains.

~~~
hoffs
It wouldn't be good journalism if the idea behind the change wasn't twisted
into some conspiracy theory.

------
eleitl
This is why I stopped using Chrome years ago. My current Firefox is configured
to display the protocol along with the FQDN, without any munging,
thankyouverymuch.

Google is self-isolating into a walled garden. Good riddance.

~~~
degenerate
All of these "help the dumb Chrome user" additions to Chrome _really_ need a
way to be turned off. All the Chrome team needs to do is add a Developer
Options section, which they already have on Android, that allows us advanced
users to undo all the asinine changes to the browser over the past 2 years:

\- OS-style scrollbars were removed [1]

\- Backspace no longer goes back [2]

\- Can't click on the "Lock" in URL bar to see certificate info anymore

\- Tabs were redesigned, and take up more space

\- Extensions are no longer a simple list, but are now gigantic unnecessary
"cards"

\- "Chrome Web Store" link from Extensions section is now hidden underneath
hamburger menu

\- Half the themes for the old design are literally broken now

Those are the biggest bugbears for me, and Google simply throws up the middle
finger at advanced users and expects us to understand that these changes are
better for everyone. _NO they are not!_ Let us turn these "features" off in
Developer Options.

[1][2] There are extensions that can re-enable these top two removals. The
rest cannot be changed.

------
everdev
> while URLs may not be going anywhere anytime soon

Nothing in the article talks about killing the URL, just flagging spam URLs
like G00GLE.com (with zeros) that might be security risks.
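To illustrate why a lookalike such as G00GLE.com is mechanically flaggable, here is a toy homoglyph check. This is not Google's actual heuristic; the brand list and the confusable-character table are made up for the example.

```python
KNOWN_BRANDS = {"google", "paypal", "apple"}

def looks_like_brand(hostname):
    # Normalize common digit-for-letter substitutions (0→o, 1→l, 3→e, 5→s)
    # and see whether the result collides with a known brand name.
    label = hostname.lower().split(".")[0]
    normalized = label.translate(str.maketrans("0135", "oles"))
    if normalized in KNOWN_BRANDS and label not in KNOWN_BRANDS:
        return normalized  # the brand this hostname imitates
    return None

print(looks_like_brand("g00gle.com"))  # → google
print(looks_like_brand("google.com"))  # → None
```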

~~~
arkades
>"What we’re really talking about is changing the way site identity is
presented," Stark told WIRED. "People should know easily what site they’re on,
and they shouldn’t be confused into thinking they’re on another site. It
shouldn’t take advanced knowledge of how the internet works to figure that
out."

> And while URLs may not be going anywhere anytime soon, Stark emphasizes that
> there is more in the works on how to get users to focus on important parts
> of URLs and to refine how Chrome presents them.

I read that as "Google is working on decoupling site identity and navigation
from URLs."

~~~
muthdra
Because scammers are exploiting the way we use URLs to the detriment of
users. URLs are great as a mnemonic point of entry but you can easily scam
someone out of a trusted website into your cloned version of it without them
even noticing; it's a very slow and very dumb attack where the shady URL
itself is the corrupt man-in-the-middle.

~~~
arkades
I understand the purported rationale for killing URLs. My comment was
clarifying where the google spokesperson made a statement I interpret as “we
are going to kill URLs.”

------
jpeg_hero
lost in the discussion is that major brands spend millions buying their own
name back from google.

see an ad on tv and then type in "progressive" into the omnibox and click on
the first link (paid ad by Progressive Auto Insurance on its own brand) and
google gets $10+

type in "progressive.com" and google gets $0

~~~
freedomben
A bit conspiratorial and counter to Hanlon's Razor, but an interesting and
reasonable theory. But $10 for one click? How do you know that?

~~~
turtlecloud
The click price is open info.

A quick Google search on keyword pricing will send you to Keyword Planner,
which shows the click price.

If you think it’s fake news just bid on the keyword yourself.

It’s a bit offensive to claim conspiracy when facts don’t align with your
understanding of the world.

~~~
freedomben
> _If you think it’s fake news just bid on the keyword yourself._

I don't see where I even implied that it was fake news. In fact, I inquired as
to where one could find this information.

> _It’s a bit offensive to claim conspiracy when facts don’t align with your
> understanding of the world._

I explicitly called it _"an interesting and reasonable theory"_ lol. "A bit
conspiratorial" does not mean "false" or "fake news". And what _is_ my
understanding of the world then? I assume you know since you have suggested
these facts don't align with it. I disagree, in fact these things align very
well with what I consider to be my understanding of the world, but perhaps you
can enlighten me.

------
cdnsteve
Firefox and Duckduckgo here we come! Recent Google/Chrome changes have me
baffled. It's a great time to switch. Vote with what you use folks.

~~~
mancerayder
This combo works fine. Also, using DDG as your default search engine gives you
more diverse results, because you can still put g! before the search terms to
automatically send the search to EvilCorp --- oops, I mean Google --- as
needed.

------
taeric
Hard to call these the first steps. The entire point of the omnibar is to blur
search and address.

------
marcus_holmes
This is to protect dumb users. Can't we train them instead? I'm getting fed up
with everything in my life being dumbed down so that idiots don't hurt
themselves with it.

Can we put up a "you must be at least this willing to learn stuff in order to
use this internet" sign on the information freeway on-ramps?

~~~
IshKebab
People aren't dumb just because they aren't experts on URLs.

~~~
marcus_holmes
You're creating a false dichotomy.

I am defining "dumb" as "incapable of determining what url they are visiting
and therefore vulnerable to scams". That's a looong way from any sane
definition of "expert".

~~~
neolefty
Am I dumb to not open the can of oil I put in my car and check whether its
heavy metal content is too high to be fresh oil? It's just not practical — if
Google can mass-check things for me with TrickURI, it will very likely protect
me (a professional programmer) against phishing. Let alone non-web
professionals who are nonetheless experts in their field.

I'd rather my doctor and car mechanic and grocer get to focus on their areas
of expertise than have to learn some baroque rules about links in their email.

~~~
marcus_holmes
I have no idea about the car oil thing. I don't own a car and rent one when I
need one. All aspects of the maintenance and care of any car I might be
driving I leave to its owners. Which is not that far from what Google are
proposing - that someone else (Google) takes responsibility for the machine
we're operating and makes sure it's safe.

The problem, of course, is that this trains us to be incapable, and leaves us
incapacitated if anything goes wrong. If my rental car breaks down I have no
idea what to do except ring the rental company and hope they can send someone
to fix it. Likewise, if Google's filter makes a mistake (which it will) then
the user has no ability to make any kind of decision on their own. They'll
click on the fake bank, lose all their money, and whose responsibility will
that be? Google won't pay them back - they just provided a free tool. The bank
will want to shift responsibility ("you must have done something unsafe,
Google stops all phishing attempts, so you must have told them your login
details"). The net result is that while most people will be safer, some people
will be in a worse position than they are now.

It doesn't solve any problems for anyone, it just makes us helpless if there
is a problem.

------
48309248302
Showing URLs is essential. Google isn't able to determine what sites are safe.
You can test it by signing up for a lot of Google Alerts. I get fed many shady
URLs in Google Alerts and have clicked on some of them accidentally because
the URLs aren't displayed in the emails anymore. If the URLs had been clearly
displayed, I wouldn't have clicked on them.

A tool that warns people when something appears wrong with a URL would be
useful, but hiding URLs from users would be a terrible idea. It will create a
generation of people who don't even have the capability to visually scan a URL
to see if it's safe.

Technology shouldn't be dumbed-down for people so much that many people who
are capable of learning how it works will never see enough of the details to
become curious. I've even met professional web developers who don't understand
URLs well due to the current URL trimming in most browsers.

------
mooman219
I'm a bit confused on when the URL checking is applied. Is the browser going
to maintain a list of "good" domains and validate against that? If so then
that'll need a lot of maintenance. Getting yourself on that list would need to
go through some process that probably isn't transparent. If your legitimate
website is too similar to one on the known "good" list, then it'll also
probably be a process to get that resolved too, all while you're losing
visitors. If you attempt to automate this, then there's a new risk that someone
gets a malicious site in.

~~~
wmf
The existing safe browsing system is more like a blacklist and it sounds like
this new system will be similar. [https://blog.chromium.org/2012/01/all-about-
safe-browsing.ht...](https://blog.chromium.org/2012/01/all-about-safe-
browsing.html) But if the machine learning model is small enough one could
imagine building it into the browser.

------
tokyodude
Safari already does this to some extent and as a dev I hate it. I have 2
jsfiddles open in separate windows. The URL area just shows "jsfiddle.net", so
there is no way to see whether 2 identical-looking fiddles are the same or
different, nor which one is the fork and which one is the original.

~~~
sharkjacobs
I might be misunderstanding something (not familiar with jsfiddle) but if your
problem is that you can't see the full URL you can change that at
Preferences>Advanced>Show Full Website Address

~~~
saagarjha
Not on Mobile Safari :(

~~~
kkarakk
iPad is TOTALLY a valid laptop replacement!

~~~
saagarjha
It is for a lot (dare I say, most?) people.

~~~
kkarakk
Excluding programmers and content creators, the very people Apple is trying to
push away from Macs and toward iPads.

------
ajvs
Solution: the True Sight browser extension, which identifies CDNs like AMP[1].

It shows an icon in the address bar to inform you whether the website is
partly or wholly hosted by CDNs (e.g. Google, Cloudflare, CloudFront, etc).

You can block partial CDNs (eg scripts, images) using NoScript or uMatrix, for
full webpage CDNs like AMP you'd have to observe the True Sight warning and
navigate away manually at the moment.

For partial CDNs it's also worth noting the Decentraleyes extension, which
loads popular resources from its local bundle rather than from the CDNs.

[1] [https://addons.mozilla.org/en-US/firefox/addon/detect-
cloudf...](https://addons.mozilla.org/en-US/firefox/addon/detect-cloudflare-
plus/)

~~~
neolefty
That's useful, but it's not what this article is about.

This article is about a tool called "TrickURI" which detects misleading names
like "G00gle.com".

------
wyck
Google continuing to undermine the founding principles of the web for that
sweet sweet ad revenue, a very minor part of the actual web experience for a
dwindling population of ad clickers. Digging a hole straight down, sugarcoated
with BS and delusion.

~~~
Jedi72
It's all about control. They want to own you.

------
prestonpesek
This is an attempt to seal off one of the main pathways out of interaction
with its monopoly. “This will make you safer” is a common refrain of those who
seek to maintain control over everything you do.

~~~
neolefty
The actual work doesn't seem to be about that at all — it's about a tool
called TrickURI that identifies names that are visually confusing, such as
"G00gle.com". The headline is clickbaity.

So if I go register a name "neolefty" that nobody has ever heard of, TrickURI
isn't going to object unless there's a fantastic service out there already
called "ne0lefty". Which sounds legitimate to me. Kind of trademark-law-y, but
consensus is how language works.

------
skybrian
Headline is misleading. This is about Google's research into how users are
misled by tricky URL's and what to do about it.

~~~
imhelpingu
The article is about how Google PR people are now openly advocating for
removing the concept of a "URL" from web browsing for security reasons. What
are you talking about.

~~~
dang
Please don't spoil a perfectly fine comment by putting in a swipe at the end.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

~~~
imhelpingu
This is the second comment I've seen where you being a hallway monitor is far
more disruptive and annoying than whatever you're responding to. FYI, making
it clear that I'm being contradictory is better than a passive-aggressive
declarative statement, which is a position I shouldn't have to explain, but
since you're apparently so concerned about how other people communicate, there
you go.

~~~
dang
I'm sorry it's annoying. Unfortunately, you have a history of incivility on
this site. Comments like "what are you talking about" have no purpose other
than to add hostility, and other comments of yours have been similar (e.g.
[https://news.ycombinator.com/item?id=19004244](https://news.ycombinator.com/item?id=19004244)
and below). That's not cool, and if you keep it up we're going to have to ban
you again. So please just use HN as intended, and then we won't have to annoy
you with moderation.

Among other things, that means posting with respect for fellow users,
regardless of how wrong you think they are.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
keyle
Validate and confirm that what you're looking for is good, yes.

Obfuscate what you're looking at, no.

------
aboutruby
The only new bit of information I learned from this article is Google's
Trickuri project:
[https://github.com/chromium/trickuri](https://github.com/chromium/trickuri)

It contains all sorts of interesting test cases to test how various URIs are
displayed in various parts of the UI.

------
chimen
You only need a threat to introduce any kind of "safety measure". This is a
tactic used by governments all over the world as an excuse to enter and take,
to surveil, and to monitor all kinds of activities. It's no surprise that they
always put safety front and center for measures designed to align with their
marketing agenda. Unfortunately, the masses will agree and blindly subscribe.

I'm still waiting for them to find a "safety-first" reason to ban adblock from
the store.

------
prepend
Do you know what is not confusing at all? And is also fraud proof? AOL
keywords.

I always wondered who Google would be, but I never guessed AOL.

This is a terrible idea, but I suppose it is easier to organize the world's
information this way.

~~~
seanp2k2
Came here to post this; they're re-inventing AOL keywords. It was bad then and
it'll be bad now. Chrome is user-hostile.

------
lapinot
Besides the fact that this article is clickbait, "killing URLs" is actually
something that is happening. I'm surprised nobody put this in relation to the
fact that Google has a search engine, something much more important to this
topic than the fact that it has a web browser. Hiding URLs and making them
second-class citizens means nothing alone. The reason it is done
is to promote an alternative way of accessing/finding things, with the goal of
controlling how people move/navigate (since this is directly translatable to
ad-money and behavioral data collection). Just think about every single ad-
based (eg lock-in based) social network/service/web-app: they want you to
navigate through _their_ searchbox and result page, because they want to
control what you see and keep you from going away.

So killing URLs isn't much about this particular action from google or about
web-browsers, it's actually embodied by the much broader trend towards
searchbox-centered ui and "related stuff"-based navigation (instead of an
absolute classification system like trees, tag hierarchies etc). Please f*ck
off with your search engines and recommendation systems, just give me some
tools to build taxonomies so i can organize myself. I know what i want and i
know how i want it classified. IT was meant to organize (and process)
databases please just stick to that, i want a library not a bookshop.

------
onetimemanytime
Exactly! No need to remember websites anymore, just go back to google, search
again and click on some ads while you're there. Sorry, but I don't trust
Google or their self-serving proposals at all.

------
kmlx
for everyone that says or implies that “google is out to get us”:

We have had users who thought the small green HTTPS lock next to the URL meant
a shopping site (the lock resembles a purse), so to them seeing it meant "safe
shopping", vouched for by Google.

Users are much less knowledgeable than anyone here can imagine.

------
z3t4
Aren't web certificates supposed to solve the identity issue?! It's not the
first time Google has tried to kill the URL, though. The reason Google wants
to kill the URL is that they want to replace it with a Google search bar, so
that instead of typing in a URL you make a Google search. Google pays browsers
_a lot_ of money in order to make the URL bar work that way.

------
pvorb
Why not simply show identicons that represent a hash of the domain you're on?
[https://vorba.ch/2018/url-security-
identicons.html](https://vorba.ch/2018/url-security-identicons.html)
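As a rough sketch of the idea (the grid size and glyphs here are arbitrary), an identicon only needs to be a deterministic, left-right-symmetric rendering of a domain hash, so a lookalike domain such as g00gle.com hashes to a visibly different icon than google.com:

```python
import hashlib

def domain_identicon(domain, size=5):
    # Derive a small symmetric pixel grid from SHA-256 of the domain.
    digest = hashlib.sha256(domain.encode()).digest()
    half = (size + 1) // 2
    rows = []
    for y in range(size):
        bits = [digest[(y * half + x) % len(digest)] % 2 for x in range(half)]
        row = bits + bits[-2::-1]  # mirror the left half onto the right
        rows.append("".join("#" if b else "." for b in row))
    return "\n".join(rows)

print(domain_identicon("google.com"))
print()
print(domain_identicon("g00gle.com"))
```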

~~~
Shorel
I don't think "simply" and "identicons" belong in the same sentence.

~~~
pvorb
I just watched the talk "don't say 'simply'" the other day [1]. You're right,
"simply" is wrong here.

[1]: [https://jameshfisher.com/2018/09/13/dont-say-simply-
writethe...](https://jameshfisher.com/2018/09/13/dont-say-simply-writethedocs-
prague.html)

------
EamonnMR
This has been a long time coming. Wacky things like "https://" were never
going to fly with the general public. Though tech support for them is going to
get a bit harder now.

------
imhelpingu
"Google Takes Last Steps Towards Killing the Open Web by Making Everyone
Register With Centralized Certificate Authorities Just to Run an HTML Server."

I sincerely hope I live to see this quintessentially evil corporation go
bankrupt.

------
dangerface
> the complicated part is developing heuristics that correctly flag malicious
> sites without dinging legitimate ones.

It's not complicated, it's impossible; Google just doesn't care if they kick
other people off their internet.

------
juststeve
So, Google would become an authoritative source for all websites if this goal
is achieved?

Are they proposing that the DNS standard be "updated" so that every request at
runtime must verify that a website is legit?

------
exegete
The TrickURI tool mentioned in the article:
[https://github.com/chromium/trickuri](https://github.com/chromium/trickuri)

------
PretzelFisch
> "Killing the url"

The title is a little clickbaity, I guess. Google only discussed displaying a
warning when a URL looks phishy. Or did I miss the killing/hiding-URL part?

~~~
j16sdiz
They have been doing it for a while. Removing https://, removing the padlock,
removing www., hiding the path...

~~~
Klathmon
HTTPS still shows up for me, as does the padlock; www is still showing as well
(that change was fairly quickly reverted after the feedback they got).

------
t12ya
Any large software project gets taken over by bureaucrats once all the basic
issues have been solved by the real programmers.

Chrome introduced garbage like rounded corners, now this.

It's the Gervais principle in action.

------
aouk
Since the problem is with "careless" users when it comes to "noticing" or
"reading" a URL (which has a position on the screen that I suppose most users
know by now), I think the solution should be designed around UI/UX principles,
since one of the purposes of UI/UX (correct me if I am wrong) is to let and
help the user "read" and "notice" every bit of information, using every pixel
on the screen.

If the goal is to help the user "notice" and "read" URLs, then why not just
(for example):

-use another font for URLs, one which mitigates the similarities between characters

-make the URL bar bigger, with a bigger font

-use a small popup (à la Wikipedia) that shows the URL clearly when the user hovers over it. Actually, something like this already happens when you hover over a link in some browsers: a lone grey container with a black font appears in the bottom left of the window, containing the URL of the link.

But if the goal is to set up "yet another input device" for a database of shady
URLs and domain names, well... you will surely need to integrate with the
UI/UX parts of the browsers that have the greatest usage share.

Or if the goal is to hide one important part of the mechanics of the web (and
make software feel even more magic...), then don't show the URL at all.

------
walterbell
There is an AMP proposal for certificates to replace DNS for web site
identity. In that scenario, Chrome would stop showing URLs and would only show
the human-readable identity that signed the page, as validated by a
certificate. This would allow signed site pages to be navigated offline or
replicated widely or blacklisted.

Demo video & more info:
[https://news.ycombinator.com/item?id=17923156](https://news.ycombinator.com/item?id=17923156)

~~~
Eridrus
I don't really want to drill into what's happening with AMP, but I am
skeptical of attempts to move from domains to some concept of real world
entities with signing. We tried that with EV certs, but it turned out pretty
easy, albeit expensive, for bad guys to get EV certs.

------
nakodari
I am skeptical that they can kill the URL. It's like those smart inbox
startups that decided to kill email a few years ago but they ended up shutting
down and email has become a stronger and more important tool in communication
now. Some things don't need to be replaced or killed. If it works, it works.
URL is the foundation of the web and killing it would be more difficult than
killing the email. Just my 2 cents.

------
Causality1
Frankly I'd rather use a browser that blocked all URLs with non-ASCII
characters than one which didn't show the plain URL in the address bar.

------
systematical
Maybe I missed something, but I didn't see anything about removing address
bars or promoting AMP in there... Then again, it's late here.

------
lvs
Isn't it convenient that Chrome's moves against the URL also make it easier
for Google to MITM/proxy the whole world using AMP?

------
throw2016
This is a classic example of 'manufacturing' a problem so you can 'solve' it
in a self-serving way.

At some point technical discussions will have to rise above the current
naivety and various parroted 'laws', and demonstrate an understanding of the
real world and of corporate and human behavior, instead of lamenting after
the horse has bolted.

~~~
wyattpeak
While I'm no great believer in their particular solution being motivated by
pure altruism, it's simply false to suggest that the problem is manufactured.

Telling whether a site is served by whom you think it is is fast becoming a
crucial skill, to the point where I explain at least the basics to even the
most technically illiterate computer users I know. It's a real problem, in
substantial need of a solution.

~~~
bad_user
Why can’t the solution simply be education?

I don’t buy this learned helplessness. Schools need to evolve with the times.

We teach kids calculus in school, surely we can teach them to read a freakin
URL.

------
dandare
Why don't all browsers simply visually emphasize the TLD and second level
domain (or third level in a case like example.co.uk) in any given URL?

Yes, sometimes it is not easy to recognize the real address in a scam URL like
"dropbox.com.scam.ru" but browsers could make absolutely clear what the TLD
and SLD are.
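
(A minimal Python sketch of how a browser might pick out the registrable
domain to emphasize. Real browsers consult the Public Suffix List; the tiny
hardcoded suffix set here is an assumption for illustration only.)

```python
# Sketch: find the registrable domain (eTLD+1) of a URL's hostname,
# i.e. the part a browser would visually emphasize.
from urllib.parse import urlsplit

# Stand-in for the Public Suffix List (illustrative subset only).
PUBLIC_SUFFIXES = {"com", "ru", "co.uk"}

def registrable_domain(url: str) -> str:
    host = urlsplit(url).hostname or ""
    labels = host.split(".")
    # Find the longest known public suffix, then keep one extra label.
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in PUBLIC_SUFFIXES and i > 0:
            return ".".join(labels[i - 1:])
    return host

print(registrable_domain("https://dropbox.com.scam.ru/login"))  # scam.ru
print(registrable_domain("https://example.co.uk/page"))         # example.co.uk
```

Note how "dropbox.com.scam.ru" resolves to "scam.ru" - exactly the part a
user should be looking at.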

~~~
sricks3
The version of Firefox I'm on right now does. Most of the current URL is a
light gray, while the "ycombinator.com" portion is white which gives much more
contrast against the dark gray background of the URL bar. (I'm using the dark
theme, but I'm pretty sure the normal theme does this, too.)

------
xg15
I understand their motivation, but in none of the articles murmuring about
them "taking away the URL" have I so far seen any concrete approach to how
the replacement could look.

Even if you are willing to sacrifice the "you can write them on a napkin and
share them with everyone" feature, it's not clear to me what other identifier
would fundamentally solve the identity problem:

Even if you forced every website to get an extended validation certificate
from a preselected CA and then based website identity solely on the
certificates, what would stop you from registering a misleading company name?
(There are precedents for that, btw. Search for "World Conference Center Bonn
Scandal" if you want to read some hilarity)

Additionally, as the article mentions itself:

> _The big challenge is showing people the parts of URLs that are relevant to
> their security and online decision-making, while somehow filtering out all
> the extra components that make URLs hard to read._

I feel the approaches we have seen so far rest on the assumption that the top
and second level domains of the hostname are the only "important" parts of a
URL and the rest can be hidden. I think this assumption is simply false, even
for a vast number of non-technical use cases: often, "identity" is not just
about the organisation behind a URL but also about the content - e.g., you'd
like to know which article of a blog a link leads to.

More importantly, many sites are divided into user profiles, where the
identity of a user is given by a subdomain or a path segment. Just knowing
you're on [https://facebook.com](https://facebook.com) doesn't tell you
whether you're viewing the actual profile you want to view.

Finally, even the "cruft" is sometimes important, if only for knowing it's
there. E.g., I frequently remove tracking/referral arguments before sharing a
link - both to make the link easier to remember and to disrupt tracking.
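
(That cleanup is easy to automate. A minimal Python sketch, using only the
standard library; the parameter list is an assumption - real trackers use
many more names than these.)

```python
# Sketch: drop common tracking query parameters before sharing a link.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative subset of well-known tracking parameter names.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"}

def strip_tracking(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/post?id=42&utm_source=hn"))
# https://example.com/post?id=42
```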

Also, unrelated:

> _The Chrome security team has taken on internet-wide security issues before,
> developing fixes for them in Chrome and then throwing Google's weight
> around to motivate everyone to adopt the practice_

Is that how we imagined internet governance to work? Didn't we have standards
bodies like the W3C or the IETF that were supposed to make decisions on that
scale?

~~~
majewsky
> Search for "World Conference Center Bonn Scandal" if you want to read some
> hilarity

FYI: I tried, and the only Google result is this exact HN post.

~~~
xg15
Ah, apologies. I guess my optimism was too great that this made more than
local news.

I can't find an english-language article about the story, so here is a german
one:

[https://m.dw.com/de/der-bauskandal-um-das-world-conference-center/a-16808295](https://m.dw.com/de/der-bauskandal-um-das-world-conference-center/a-16808295)

To summarize the story:

Bonn used to be the capital of West Germany during the Cold War. When that was
over, however, Berlin got reassigned as capital and Bonn went back to being a
mostly ordinary small town.

They never quite got over the demotion, though, and the city made numerous
attempts at staying internationally relevant. One project was to become a UN
base of operations for Germany. Apparently, for that it's a requirement that
you build an oversized hotel and conference venue.

The city had trouble finding investors for the project, but eventually a
Korean company, "SMI Hyundai", stepped forward.

If you want to believe the official records, then apparently due diligence
went out the window at the mention of the name "Hyundai". City officials
assumed that they were somehow affiliated with the automaker and were quick to
trust them with city-backed loans in the millions.

It turned out they were scammers, not even remotely capable of contributing
to the project. In the end, the city suffered damages of several hundred
million euros.

The company had never had any relation to the automaker either. It just
"happened" to have the term "Hyundai" in its name...

------
auslander
I feel the end game is that all websites will be served to you by Google,
like an invisible proxy, and you wouldn't be able to tell. Google Search
already rewrites result links with Google ones to see which result you
clicked. AMP plays toward that goal too.

------
boztek
Reminds me a bit of the "URL"/transition component of Doug Crockford's post-
web secure distributed app delivery idea.

[https://youtu.be/O9AwYiwIvXE?t=2400](https://youtu.be/O9AwYiwIvXE?t=2400)

------
t0astbread
This is a worrying development. It seems like this tool would help at-risk
users but also penalize legitimate sites/services. Not to mention, this is
Google shaping the internet to its needs again.

Also, the title is clickbait, so now I feel bad for upvoting.

------
rwmj
A better idea might be to allow EV certificates to contain a large logo which
is displayed next to the URL bar, and ensure that only trademarks can be
placed by their holders into these certificates.

------
footprintcomp
Wouldn’t displaying the top level domain name in bold or another colour make
identification easier, or am I missing something?

------
sunseb
Yeah, folks at Google want their search engine - not the URL - to be the
entry point for the web.

We need to find (and promote) alternatives to Google (and Facebook). They are
making the world a WORSE place.

------
macca321
Google should never have given PageRank rewards for 'meaningful' URLs; they
should have rewarded meaningful link descriptions instead.

URLs should be opaque; then we wouldn't have this mess of people trusting
them. Also, we would have HATEOAS instead of OpenAPI :D

------
anuraj
So by obfuscating data from users, Google is salvaging the Internet?

------
_pdp_
The way they killed all their other useful products?

------
Shorel
My wild guess, based on events this week:

Step zero has been partnering with MS to kill Firefox.

------
marssaxman
can we just kill Google instead?

~~~
dang
Please don't post unsubstantive comments here.

------
trumped
kill AMP first, then talk about that, Google.

------
RcouF1uZ4gsC
Sounds like Google is re-inventing EV certificates, which attempt to tie a
URL securely to a business. Unfortunately, nobody pays attention to EV
certificates.

Now we will be getting a proprietary Google replacement for them.

