This is not about analytics. This is just an experiment to test if using ajax instead of the typical form submit can make the result page load faster.
Here's an official statement (http://searchengineland.com/google-ajax-search-results-death...) -
"We’re continually testing new interfaces and features to enhance the user experience. We are currently experimenting with a javascript enhanced result page because we believe that it may ultimately provide a faster experience for our users. At this time only a small percentage of users will see this experiment. It is not our intention to disrupt referrer tracking, and we are continuing to iterate on this project."
Many of the comments seem to assume that websites are entitled to know what I searched for. I think that it would be a great step to give millions of people a little more privacy, even if it's just an accidental side effect of a change to Ajax.
I feel that the benefit in letting webmasters know what their users are looking for outweighs any privacy gains to be had. If people are that concerned about their privacy they could always set their browser to not send the referrer. I doubt that this move has anything to do with concern over user privacy.
Making it about privacy is playing an emotional card when it should really be a technical discussion. Hiding a query string isn't "privacy" in any meaningful sense. It's beneficial for you to tell a webmaster what you came looking for.
I'm in two minds about the issue, though. Google don't have a contract to supply the search term on outgoing links (via the referrer or otherwise), but for many applications it was a pretty useful thing for them to do. Arguably their only resource there is a single search page that varies depending on your query string - to me this says ?q=searchterm is the semantic way to represent it, but hey, it's their application, so they can "design" it however they want. On the other hand, URLs are about as explicit as you can get on the web, and Google's are more publicised than most. Changing your public API is asking for trouble.
It's beneficial to the webmaster, not the user. If most people were presented with a choice of whether or not to provide this information, they'd decline.
It's win-win. If a webmaster knows that traffic is coming in for a specific term and how long that traffic sticks around, they can decide whether they have sufficient content there to keep those users happy. If not, they can add content or tweak the site, which benefits the user.
If a user comes across a site because of a google search, it's more likely than not that he's there for a specific purpose and does not plan on returning. If the webmaster is able to glean useful information from his behavior and use that to improve his site, then it's possible that users in general are better off, but the original user is long gone.
You walk into a store, can't find what you were looking for and leave without a word. You keep doing the same for a month, and keep blaming the store for not stocking enough items. Wouldn't just telling the seller what you're looking for benefit both of you?
That analogy doesn't really work - if they came to your site from a particular search term it would seem likely the content is already there.
Even if people are reaching a site via a mismatched keyword, I wouldn't have thought many people check their logs and update their site just because they're getting a lot of hits for a particular term they don't have much content for.
True. I was kinda looking at it from my own perspective. My blog usually gets hits from people looking for "college education india". I can retarget my content towards college students, put up a few AdSense ads and make some cash. Of course I wouldn't do that, since my blog is strictly personal, but I think it's a viable strategy for driving more traffic to your site.
I would even support this argument if Google weren't evil and didn't store personalized search logs themselves. Then this would indeed be a privacy win.
The way things are now, this smells suspiciously like an attempt to further monopolize useful data that were available to webmasters before, and to kick competitors out of the web analytics space. You might argue (with good reason) that these data should have remained private from the beginning, but then Google should stop logging them first and only then remove them from the results. The other way round, it's "do as we say, not as we do."
No, they're not. It's not like they're deliberately obfuscating the information in order to shut everybody else out. They've made a technology decision which will probably significantly speed up searches and result in increased revenues for them. Google shouldn't be beholden to other people's business models.
That said, given the amount of referrals that Google generates, and how important referral information is for a lot of things online, as a courtesy Google should probably communicate this to the web-at-large before rolling it out on a massive scale. However, they're not "cutting off" other search analytic packages. At worst they're forcing a rushed bugfix.
They aren't obfuscating the information, they're removing it altogether. As an analytics provider, you cannot 'bugfix' a lack of information. You're just hosed.
Probably what would have to happen is for the browsers to allow for passing data after # symbols the same way they do after ? symbols. That means updating every major browser to support this. At the speed of browser development, many analytics providers may be out of business by the time that happens.
Yes they are cutting off other search analytic packages. If there is no tracking of the search term in the referring url then the analytics package has no way to determine what keywords were used to bring a user to the page.
Since google provides one of the most commonly used analytics packages, Google Analytics, and owns another, Urchin, I would think that they would be the last ones to try this.
I think that Urchin was the basis for GA. Urchin is still available from Google, though: http://www.google.com/urchin/. The main difference that I know of between them is that Urchin can parse server logs as well as track users through javascript.
Ahh, I stand corrected. This is what I get for only reading half the article.
Yeah, sucky problem. I guess we'll have to wait for Firefox 5 and IE 10 to have support for hashtags in referrals.
EDIT: My point still stands that Google shouldn't be beholden to others for their technological/business decisions. Though if it's deliberate, it certainly tightrope-walks the "Don't be Evil" line.
Hashtags in referrals will not happen. That would break a great deal of things, considering an unencoded "#" in the URL is not expected by nearly any server-side scripting language. It would probably be treated as part of the last GET parameter.
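To make that concrete, here's roughly how an analytics script pulls the term out of a referrer today (a minimal sketch, not any real package's code), and what an unencoded fragment would do to it:

    // Minimal sketch of referrer parsing -- not any real package's code.
    function searchTermFromReferrer(ref) {
      // A classic results referrer looks like:
      //   http://www.google.com/search?q=some+term&ie=utf-8
      var queryStart = ref.indexOf('?');
      if (queryStart === -1) return null;
      var pairs = ref.substring(queryStart + 1).split('&');
      for (var i = 0; i < pairs.length; i++) {
        var kv = pairs[i].split('=');
        if (kv[0] === 'q') return decodeURIComponent(kv[1].replace(/\+/g, ' '));
      }
      return null;
    }
    // If browsers ever sent the fragment unencoded, a referrer like
    //   http://www.google.com/search?ie=utf-8#q=some+term
    // would leave "#q=some+term" glued onto the ie parameter above,
    // and q is never found.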
Who is this a problem for? It's going to improve search results because now SEO gamers have less data to work off of. If you're a website, you should know damn well what keywords your website attracts... and you shouldn't be tweaking things around just to trick Google into sending you more traffic.
Now web sites can focus on creating content, and google can focus on creating good search. Both parties will benefit at the cost of spammy SEO gaming websites, who are now blind and will be pushed to the bottom.
Google is a business that wants you to use their analytics service over others. They have no obligation to send you the information their users are using to find your site.
Google is also arguably close to being an effective monopoly in web search. Antitrust law prohibits using monopoly power in one market (e.g. search) to acquire a monopoly in another market.
So there's a bit of a bind for Google: strategies that were perfectly legal when they were up-and-coming become illegal at some point of dominance. But there's no 'bright line' for when the rules change -- market definition, thresholds, and acceptable practices can all be argued after the fact.
A naturally competitive company is almost certain to keep the 'pedal to the metal' until it's too late.
> Shouldn't Google push for dominance as far as it possibly can?
If they want to live up to their 'Don't be Evil' motto then they should do it with better products and services not by cutting others down. In the long term I think that will work out better for them.
> Isn't it some other entity's role to stop them?
You could also run that argument for a gangster or any type of criminal I suppose.
A change of this magnitude is not to plug their analytics service. It's to improve search results by cutting off the feedback they're currently giving to SEO gamers. Read my other comment below.
I can't really see how Google could easily pass the search query to Google Analytics. GA requires atomic data to perform all its analysis and custom segmentation magic.
The only way I see is using a cookie (with a user id) on the google.com domain, which the ga.js backend would have to interpret on every request. But this means ga.js would have to be dynamically generated rather than a simple static file (and yes, it wouldn't be cached by users visiting your site, making tracking even slower). I don't think Google wants this additional load every time anyone loads a GA-enabled page on the web. Besides, ga.js is currently hosted on google-analytics.com, not google.com, so all users would need to upgrade to a new tracking script. And it would no longer be possible to self-host the script.
Anyway, this does not sound like anything reasonable to me. I hope it won't be implemented - it would destroy an entire industry, and not a redundant one.
You go to Google and do a search, Google stores your search sessions on the server, and assigns you some unique ID. If you click on a search result, google logs that in your session.
You visit site with Google Analytics. ga.js reads the Google.com cookie to get your session ID in some clever way (perhaps embedding an invisible iframe in the Google.com domain, which can read the Google.com search cookie). Now, Google Analytics knows that you're the same person who just clicked on that search result .2 seconds ago... a match in my book.
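In code, the speculation amounts to something like this - every name here is invented, and nothing suggests Google actually does this:

    // Pure speculation: the endpoint and parameter names are made up.
    // The trick is that a hidden iframe served from google.com runs in
    // google.com's origin, so its page can see google.com's session
    // cookie even though the site embedding it cannot.
    function matchVisitorToGoogleSession() {
      var iframe = document.createElement('iframe');
      iframe.style.display = 'none';
      // Hypothetical google.com page that reads the search-session
      // cookie server-side and records "this session landed here".
      iframe.src = 'http://www.google.com/ga-bridge?page=' +
                   encodeURIComponent(window.location.href);
      document.body.appendChild(iframe);
    }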
They could also tell you things like "visitors to your site also search for XXX" but that's clearly a violation of privacy. They get this data, but we don't.
Yes, but in that case other web analytics packages will also have access to a referrer that contains the search query, and there is no problem.
I wasn't clear enough, sorry - I was concerned with how Google could be able to share the search query _only_ with GA but not other analytics tools, because it is clear that Google will not sacrifice GA usefulness to some ajax SERPs.
I might be missing something, but I think Google can easily modify their redirect script to store the data they want (for use by Google Analytics) and then strip that data out before it hits the target URL.
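A sketch of what that could look like (Node-style, with invented parameter names - the real script, if any, is server-side and opaque to us):

    // Hypothetical redirect endpoint: record the term, then strip it.
    var http = require('http');
    var url = require('url');

    http.createServer(function (req, res) {
      var params = url.parse(req.url, true).query;
      // 1. Keep the search term and destination for internal use,
      //    where Google Analytics could pick it up later.
      console.log('click:', params.q, '->', params.dest);
      // 2. Send the user on; the target never sees q= anywhere.
      res.writeHead(302, { Location: params.dest });
      res.end();
    }).listen(8080);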
See how there's a hash mark # in there now, and the "q=test" is after it? The problem is that web browsers don't send anything after the # in the referrer string.
I wonder why they don't send it in the referrer string. Are there any technical or security reasons?
Originally the content after the # was only used for automatic scrolling to particular tags within a page, so there was no need to ever send it to a server. Once Ajax came around, people realized you could use javascript to read the url after the # and do arbitrary stuff based on it. So you can show different pages at

    foo.com/
    foo.com/#bar

but since the server doesn't treat them differently, when the user transitions from one to the other, there's no page reload. So you can move from page to page, but cut-n-pasting links still works.
There are libraries like RSH that let you do this to speed up your site.
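A bare-bones version of the technique (assuming a browser with the hashchange event; older browsers, and libraries like RSH, poll location.hash on a timer instead):

    // Render a different "page" per fragment, with no reloads.
    function renderFromHash() {
      var page = window.location.hash.replace(/^#/, '') || 'home';
      document.getElementById('content').innerHTML =
          'You are now on the "' + page + '" page';
    }
    if ('onhashchange' in window) {
      window.onhashchange = renderFromHash;  // newer browsers
    } else {
      setInterval(renderFromHash, 100);      // older ones must poll
    }
    renderFromHash();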
Seems to me that browsers ought to start including the fragment (the part after the #) in the referrer, now that different fragments can refer to entirely different pages. I can't see any security issues, as the previous poster was wondering.
There would be security issues if any existing website had put confidential user data after the #, expecting that no other website would see it in the referrer. This doesn't seem typical, but I'm sure some websites would have security issues if this were changed.
Also, practically speaking, even if all browser developers agree this was a good thing, it takes quite a long time for people to upgrade their browsers.
They don't owe you that information, sure it's nice I guess...
In any event, the only way I can think to do this with an ajax search is to pass every outgoing link click through a redirection script, so that the referrer is rewritten to something useful.
However, whilst improving things for webmasters, that would largely defeat the point of using ajax, and be a pain for end users.
Google are serving their end users. I don't think trust comes into it TBH.
What would you rather Google do: not move forward with ajax search, or use a redirect and make a worse experience for end users?
Unless I'm way off, Google is already running a redirection script on all outbound clicks. The endpoint url shows up in the footer of the browser on hover, but if you right-click and copy the link location, you get a redirect URL that goes through google.com first.
That doesn't affect the referrer though. The referrer is still the page where the click came from, which would still be e.g. google.com/search?q=whatever
The redirect script has been there for as long as I can remember. It's probably tied in with Google Analytics.
The donothing=true tells Google's webserver to perform no actual query and return a blank page, which should happen extremely quickly. (The request could be made after the ajax results have fully loaded, so as not to affect response time.)
Then when a user clicks a search result, Google does a javascript redirect from the iframe (use javascript to add a new <script> tag with a window.top.location= in it) and you get a referer just like you would before... with a negligible performance hit.
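Roughly like this - where donothing=true is your hypothetical parameter, currentQuery is assumed to hold the user's search terms, and whether browsers would really report the iframe's URL as the referer here is itself an assumption:

    // Sketch of the proposed trick; "donothing" is made up.
    var frame = document.createElement('iframe');
    frame.style.display = 'none';
    frame.src = 'http://www.google.com/search?q=' +
                encodeURIComponent(currentQuery) + '&donothing=true';
    document.body.appendChild(frame);

    // On a result click, navigate the top window from *inside* the
    // iframe, so the referer the target sees is the iframe's URL,
    // complete with q=.
    function openResult(resultUrl) {
      var doc = frame.contentWindow.document;
      var s = doc.createElement('script');
      s.text = 'window.top.location = "' + resultUrl + '";';
      doc.body.appendChild(s);
    }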
Really? I mean how much should they hold themselves back in order to maintain support for other products?
I know they're huge, but it's still a free product/service, they shouldn't have to retard their progress to avoid breaking other products. If they decided to stop supporting IE (while potentially bad business) I'd support their right to do so.
Yes they have the right. No they don't owe me anything (save the service I pay them to provide, which is moot to this discussion).
But it is my trust, and they can still lose it. The fact is that I believe this to be shady, and for a company whose motto is "Don't be evil", that motto raises the standard for trusting them - a standard I don't believe they are meeting here.
Yes. Google damn well knows that no major web browser sends the #hashmark part of the URL in the referrer. They damn well know this breaks every single analytics package out there.
I'm amazed at how much bias you're bringing to this. Not only does this article take the first leap of paranoia in stating that Google is doing this to sabotage competition, it goes further to say that somehow Google Analytics will be untouched. It is as if that last bit was added just to satiate the author's anger:
"I don't know how they're gonna do it, but...but...they're bad...'cause they're gonna hurt competition, so they'll find some way of not hurting themselves, they have to!"
Besides, the hashmark is a standard of its own when it comes to Ajax applications. Google isn't doing anything suspiciously new with that.
And you think this was their intent? Not to provide a better/faster search experience for THEIR users, but to deprive you of referrer data you were used to?
Yes it's a side effect, but it's ridiculous to say that was the intent of the change.
Hmmm. I'm not seeing this new "feature" when I search Google. I've turned on Javascript, I'm accepting Google cookies, I've even logged in. And I still see the usual http://www.google.com/search?q=ajax+crap results URL. I'm running Firefox 3.0.5.
The standard is the referrer being sent in the HTTP headers - not that, if the referrer is Google, it must include the search terms used in the search.
We seem to be putting more and more reliance on these flimsy things... in the future, will I cause an outage because I @somebody instead of #something on twitter?
Sorry, maybe a bad example on my part. My point is that the idea that query-string parameters passed in the referrer should include the search terms used on that search engine is a rule of thumb more than a standard... I don't know of any standards built on top of the HTTP referrer other than "this is how Google and many others operated", so these reporting companies wrote code that relied on that rather than coding to a standard.
I can see your point, but if Google adds that capability to Google Analytics they will have a significant advantage over their competitors. So the question is not simply about following an unofficial standard.
How would they even do that if there's no query data in the referer? Set a cookie on google.com and then read it back from analytics.google.com is the only thing I can think of.
Not really no. If the web browser isn't sending the information, then take that up with your browser... it's the one that has the information and is dropping it.
Then again, I should point out that I'm not seeing this behavior, so this may be much ado about nothing.
Yeah, but you can bet your ass Google knows this breaks referrer tracking in every browser out there. If they wanted to make a change this big, they should have contacted the 4 major browser vendors at least a year in advance to give them the opportunity to implement this. It's fucking bullshit.
I believe the problem is bigger than what the article points out. Under the new scheme, all search results pages have the same URL: the part after the hash is not part of the URL proper, it is a fragment (an anchor).
It breaks my firefox extension that nukes searchwiki. It breaks my custom CSS for the search results page (see userstyles.org). I often bookmark google search results in del.icio.us. Depending on how such services are implemented, different search results pages might show up as the same page.
I'd guess there's just a lot of code out there that assumes search results live at separate URLs, and that's going to break.
Those two breakages are simply due to the regex identifying the search results pages not matching anymore, and are easily fixable.
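For instance, a matcher that only knew the ?q= form just needs to accept the fragment form too (a sketch, assuming the extension keys off the URL):

    // Old: only classic results URLs like google.com/search?q=foo
    var oldPattern = /^https?:\/\/(www\.)?google\.[a-z.]+\/search\?/;
    // Updated: also match the Ajax form, google.com/#q=foo
    var newPattern = /^https?:\/\/(www\.)?google\.[a-z.]+\/(search\?|#)/;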
There are a variety of things which I don't think anybody has noted. First of all, for those without javascript, the old Google page is used. So you'll still see some sampling of referring keywords coming in, albeit only 2% of what it originally was. It's not totally ruining things.
The benefits to Google are bigger than anybody here is reading into this. It is definitely an intentional move to cloak their redirects and keywords. And it definitely increases performance for them.
Performance gains:
The back/forward buttons do not cause anything to be reloaded. This should save a nice percentage of page loads which were along the lines of "what was my last search?" and going back and forth between pages.
If you look at Firebug, the Ajax request is actually going to the proper google search page... google.com/search?q=... Google is taking the entire search page and essentially replacing it without a page refresh. This doesn't really mean anything significant, it's just a neat observation.
Strategy:
More than anything, in the long run this will improve search results. It's not about locking down the "analytics" arena... as one poster noted, they probably won't be able to show Google search terms either (unless they set a cookie on the Google search page and read it on GooAnal [yes, GooAnal] pages, which might actually happen). The important thing is that SEO gamers no longer see which Google keywords work and which don't, so it's going to be a lot tougher to game the system. The SEO firms that manage to get the spammiest crap onto the top pages won't have much to work from other than the raw number of visitors coming from Google. I imagine Google has quietly rolled out significant changes to its search algorithm as well, to work in tandem with keeping analytics in the dark.
My prediction is, in a few months, you'll be thanking Google for the much better search results. I welcome this change with open arms, because I'd much rather see quality content rise to the top, rather than gamed content targeted at keywords. The trade-off that people running websites won't be able to see referring keywords... well I guess Google might lose .001% of their market share.
"My prediction is, in a few months, you'll be thanking Google for the much better search results."
I don't really think this is going to change anything. How is this going to affect SEO? I doubt any of them use referrer logs - waste of time. They scrape the SERPs. The old SERPs are still available. They can tell when a rank goes up or down.
It's one thing to tell what rank you are on what keyword, but quite another to track how much traffic you are getting from a specific keyword. Though I guess there's ways to estimate that.
You bring up a good point, though. Guess it's not as grand a scheme as I thought.
Is anyone not getting this on google.com? It looks like it might be happening for everyone on google.com, but not on, e.g., google.co.uk or google.it.