Google Maps Mania Blog is Dropping Google Maps (plus.google.com)
269 points by chippy 850 days ago | 97 comments



What gets me is that Google employees like JohnMu (in the following thread) are actually telling authors to hide large portions of their sites from Google's indexer, in order to prevent all their pages from receiving penalties for "low-quality" content:

https://productforums.google.com/forum/m/?hl=en#!mydiscussio...

That feels like a step back from Google's stated mission "to organize the world's information and make it universally accessible and useful":

http://www.google.com/about/company/

Apparently, only the information that maximizes search advertising revenue is worth organizing and making accessible. I understand that fighting SEO spam is an important part of keeping search usable, but there must be a way to do it that doesn't lead to completely cutting out large parts of the "indie" web (e.g. posts in discussion forums, and other sites that host user-generated content).


> Apparently, only the information that maximizes search advertising revenue is worth organizing and making accessible.

I don't even understand the theory of how there is a profit motive here.

How does Google make more money by asking webmasters to hide parts of their site from the indexer?

Doesn't Occam's Razor say that a more likely explanation is that innocent sites are being caught in the never-ending arms race between spammers and Google?

If Google didn't care about the "indie" web, why would Google employees be taking the time to coach indie website operators on how to improve their sites' rankings?

I know this is a frustrating situation for everyone involved. But the cynicism that says there is some ulterior motive cuts deep, especially coming from someone I respect like you, Matt.


Hey Josh, I'm genuinely confused by your statement and now curious. Even though you work at Google, do you really not see the profit motive in making better search result pages, and why Google is investing heavily in improving them?

I would imagine you actually do see the profit motive for that activity, and yet you rejected the statement above by saying, "I don't even understand the theory of how there is a profit motive here. How does Google make more money by asking webmasters to hide parts of their site from the indexer?"

Could you explain how these things connect in your view?


Sorry, I may not have been clear enough. What didn't make sense to me is why there would be a profit motive for ignoring or not valuing "indie" sites.


Google has always said that advertising does not affect search results. This is one way Google distinguished themselves from other search engines back in the day.

There are sometimes quite a lot of ads on the search results page these days, but they are still clearly distinguished, and I haven't heard anything about that policy changing.


Gonna disagree with "clearly distinguished". Most non-tech-savvy people I know can't tell them apart, especially since the light yellow background has gotten lighter over time, to the point where it's hardly distinguishable on a cheap screen.


The cynical side of my concern is that search engines under pressure due to spammers will let niche/indie content bear the brunt of the cost, because that content is not "high quality" from the search engine perspective of "we have valuable ad inventory that lets us monetize these searches." Not all searches are profitable. Similarly, "sophisticated" users are less lucrative because we click fewer ads. A bias toward readily available metrics can easily lead to algorithm changes that benefit the mainstream majority while punishing various niche use cases as regretted but ultimately acceptable collateral damage.

The conflict is between "the world's information" (mission) and "the world's information, minus some troublesome long-tail bits that few people care about and aren't profitable anyway" (business motive).

> why would Google employees be taking the time to coach indie website operators on how to improve their sites' rankings

This "coaching" is actually Google pushing the spam problem onto content authors, promoting "solutions" that benefit Google while still hurting legit publishers. "Spam has gotten so bad that our algorithms can't classify it without misclassifying you. You can either hide some of your content as a proof that it's not link spam, or have us hide even more on presumption that it is." Yes, this gives Google a powerful tool for weeding out fake/malicious content, but it also hurts the user experience and the mission in the long run, compared to solutions that allow for more manual review, more use of inbound link disavowal, or finer-grained penalties that target only the possibly "low-quality" pages without spilling over to entire sites. Those solutions would probably shift more cost to Google, however.

What's really bad is that some of Google's recent recommendations (e.g. "use the noindex robots meta tag") will remove real knowledge not just from Google but from other present and future search engines (even if those engines wouldn't have the same problems Google did with that content). And even if the algorithm's limitations/errors are eventually resolved, the recommendations will live on much longer in legacy content (and in SEO folklore), meaning that some content will be erased from search permanently for no reason. These are long-term costs that are borne not by Google but by users, publishers, and competing search engines.
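
For reference, the tag in question is a single line in the page's <head>, and it's honored by essentially every major crawler, not just Google's:

    <meta name="robots" content="noindex">

So once it lands in a site's templates, the affected pages disappear from Bing, DuckDuckGo, and any future engine too, which is exactly the long-term cost described above.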

Personally, I want to be able to search ALL THE THINGS even if it means that I sometimes have to wade through more crap, and that it's harder work for the search engine to weed out people who are gaming the system. Maybe Google can't offer me that without degrading its more mainstream users. Even if that's the case, it still feels like a departure from principles (or maybe a declaration of defeat) for Google to be making that choice. Even more so when it hurts the ability of other search engines to step in and index the stuff that Google doesn't want.

The reason I want to see Google hew strictly to "the mission" is that it offers a way to let these big-picture issues override things that are more immediately measurable. I know I became an extensive user of Google services mainly because of the way its mission-oriented thinking was visible in its product design choices. That intangible feeling was more important than any particular feature.

Keeping it alive requires the ability to say, "Screw it, we're not going to go this route no matter its obvious benefits, simply because it's not in line with the mission." But I also know that it's not possible to do that in every decision; even here in the non-profit, mission-driven Mozilla project, principles alone can't win every fight. (See http://robert.ocallahan.org/2014/05/unnecessary-dichotomy.ht... for an interesting take on this.)

And these things tend to look different from the inside; maybe Panda was a case where de-indexing legit content was the least-bad path available. But from the outside, every time "pragmatism" wins over "principle" is a little chink in the armor. And for Google, the outside view is the only one I have.


> Personally, I want to be able to search ALL THE THINGS even if it means that I sometimes have to wade through more crap

Agreed 100%. I often start looking at the search results beyond page 10, since that skips past a lot of the "sites that Google's algorithm thinks are relevant, but are actually useless". And thanks to their ranking algorithms, the linkfarms and spammy sites tend to cluster together so it's easy to skip past them too (or just add -site:spammydomain.com to the query and try again).


> Personally, I want to be able to search ALL THE THINGS even if it means that I sometimes have to wade through more crap

Yes, yes, yes, yes, this please! Google still had this in the first half of the 2000s or so, and it was glorious.

Truly having the world's information at your fingertips. Not just Wikipedia and some high ranking sites, but as your queries got more and more specific, you'd still get results from increasingly strange and dusty corners of the web. Yes that meant you had to evaluate the value and trustworthiness of results yourself (but really, you still do), and it was so very much worth it.


Precisely why we should be supporting companies like DuckDuckGo.


DDG is supported by advertising. I don't see why they'd be immune to the same conflict of interests as Google.


What search engine isn't?


One that you pay for.


Or a non-profit search engine run as a public-good service, like Wikipedia or the Internet Archive.

Obviously storage and networking costs would make it impossible today, but it might be feasible in the not-so-distant future. Either way, sooner or later people will realize the absurdity, from a public-interest point of view, of having "all the world's information" in the hands of a single corporation whose sole legal purpose is to maximize its owners' profit, just as happened with other media in the past, like broadcast news.


What search engine is that, then?


Google are effectively altering the search results manually, just without altering their own algorithm. It is perhaps understandable that Google want to avoid making exceptions for individual sites, but a 100% algorithm-based approach may be becoming less and less useful.


It's not an ulterior motive; profit is their stated objective.


> I know this is a frustrating situation for everyone involved. But the cynicism that says there is some ulterior motive cuts deep, especially coming from someone I respect like you, Matt.

Ha! HN loves GOOG. Try frequenting this site as an Apple employee...


You get used to it.


> I don't even understand the theory of how there is a profit motive here.
>
> How does Google make more money by asking webmasters to hide parts of their site from the indexer?

Maybe they get paid by the competition. After all, removing competition with clever TOS tricks is right up Google's alley:

http://pando.com/2014/05/29/after-google-bought-nest-it-remo...

Looks like Google saw what bookface was doing and copied it: remove from existence and charge for exposure.


I'm glad you put the link there for context.

> make sure that all of your indexed pages are of the highest quality possible and that they are fantastic representatives of your website

JohnMu then gave some examples of very thin pages. It is about a site where 99% of the pages are like that, and about problems with spammy UGC. Not about maximizing search advertising revenue. It is about quality, as described in: http://googlewebmastercentral.blogspot.com/2011/05/more-guid...

I'd like to see a traffic graph (is an algorithmic penalty visible in the trend, or is traffic just down to 10% of what it was 1.5 years ago?). We have to go by their word and our own research for now. Googling "Google Maps Mania" shows a botched domain transfer as a likely culprit for the low rankings/visitors. The redirect is not a 301. Old pages do not redirect to content on the new domain ( http://www.mapsmaniac.com/2013/02/beneath-thunderdome.html ). So any links to mapsmaniac from before the domain transfer do not contribute to the new blog at googlemapsmania.blogspot.com. Domain authority is zeroed out. Many well-ranking pages now show a "Page not Found" to Google. Then another site pops up on another domain with the same name. How can Google know for sure the two are from the same owner if the redirects are not set up right?

Fix the redirect. Restore the old links and redirect them to the new site. But maybe much of that is already too late for them. For anyone else in the process of moving a site, see: http://www.mattcutts.com/blog/moving-to-a-new-web-host/
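
If anyone wants to check this kind of thing themselves, here's a rough sketch (assuming Node 18+ for the built-in fetch; the function name is mine):

    // redirect-check.ts - fetch without following redirects, so the raw
    // status code is visible. 301 = permanent (transfers link equity);
    // a 302 or a meta refresh does not.
    async function checkRedirect(url: string): Promise<void> {
      const res = await fetch(url, { redirect: "manual" });
      const location = res.headers.get("location") ?? "(no Location header)";
      console.log(`${url} -> ${res.status} ${location}`);
    }

    checkRedirect("http://www.mapsmaniac.com/2013/02/beneath-thunderdome.html");

A 200 with the old content, a 302, or a 404 here would all support the botched-transfer theory; only a 301 pointing at the matching googlemapsmania.blogspot.com URL would rule it out.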


Wait, you mean a for-profit company's goal is to maximize profit?

We need to stop pretending Google is a rich Mozilla. They're a poor Microsoft.


By what measure is Google poorer than Microsoft? They have less free cash flow, but a greater market cap.

http://ycharts.com/companies/MSFT http://ycharts.com/companies/GOOG


Market cap isn't real. That's just a valuation based on...well, usually, 'feelings'. Microsoft has real money. Google has real money too, but not Microsoft money.

That's why Twitter, which is a tiny company with no profits and probably not much more growth potential, has a higher market cap than some insurance companies and retail chains with good growth, real money, and solid long-term market potential. "Feelings".


I think I generally agree with what you're saying, but a comment made during a CFR discussion[0] makes me want to point out that those "Feelings" can be of value:

"What about Facebook? What about Twitter? I have no idea what Twitter is good for. But if it flips out every tyrant in the Middle East, I'm interested."

I guess it's just pretty hard to allocate resources in the traditional sense that most people are used to, because of the leverage that companies like Google can provide, but people try anyway.

All in all, I think Google is just trying to increase advertisers' confidence in its content network.

[0]: http://www.cfr.org/technology-and-foreign-policy/technology-...


.... That's a valuation based on "There exists a person who has recently put his money where his mouth is and paid someone $MARKET_CAP/N for 1/N of the company."

Which is hardly to say that you can unload the entire company at that price, or that it'll last, but it's a step above mere 'feelings'. :b


I think you nailed it there. Market cap is valuation and subject to change on a whim [edit- specifically, it's a measure of sentiment; it's a prediction of future cash flows] . Free cash flow is probably a more accurate measurement of the "poor" vs. "rich" concept. Let's be honest though- both companies are filthy rich.


I had this conversation with a friend: why are tech companies the darlings of public opinion while being guilty of such terrible things? Look how Monsanto is the Devil Incorporated, while Apple holds in US-tax-evaded cash around two times Monsanto's market cap (and Monsanto trades at a frothy 24 P/E). Walmart is everything evil, while can-do-no-wrong Amazon does everything Walmart does (in the negative sense), only many times greater.

Meanwhile, Google has done terrible things, including ripping off bandwidth providers, not to mention holding a monopoly on search. Conspiracy theorists seem unconcerned about one organization being responsible for answering all their web queries. I could go on about outsourcing of jobs, H-1B visas, causing a recession, etc.

Not that any of these companies are 'evil' or even 'wrong' in my eyes, but why do others treat tech companies with such endearment and throw so much hate at a handful of other companies?

We're thinking geography has a lot to do with it. Many influencers on the east and west coasts know a lot of people who work in these companies, and write very positive things about them. It's not completely altruistic, as they are also heavily invested in them.


Any kind of algorithmic classifier will have false positives at a given level of confidence. The lower the threshold at which you start banning sites, the more false positives you will have; there is no silver bullet here.
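
A toy illustration of that tradeoff (all numbers invented, just to show the shape of it):

    // Toy model: every site gets a spam score in [0, 1]; sites above the
    // ban threshold are penalized. Assume innocent sites score uniformly
    // in [0, 0.6], so lowering the threshold sweeps more of them in.
    const innocentSites = 1_000_000;

    function innocentCasualties(banThreshold: number): number {
      const falsePositiveRate = Math.max(0, (0.6 - banThreshold) / 0.6);
      return Math.round(innocentSites * falsePositiveRate);
    }

    for (const t of [0.6, 0.55, 0.5, 0.4]) {
      console.log(`threshold ${t}: ~${innocentCasualties(t)} innocent sites penalized`);
    }

Each notch you lower the threshold to catch more spammers penalizes another tranche of legitimate sites; the only question is where you put the knob.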

So when Google decided to crack down on a class of sites, sites that shared enough traits with the ones being axed came into the danger zone, and a number of those ended up penalized badly enough that they are no longer viable or interesting enough to maintain.

This appears to be one of those cases.

Being dependent on a single entity for your hosting, for the traffic you receive, and for the service you are directly promoting (in this case maps) makes you an add-on to their ecosystem.

I wonder how many small but viable websites are still in this position and how many of them will give up after finding their traffic decimated because they ended up being a false positive in some filter.

It might be more productive to try to figure out what exactly caused the penalty, but I fully understand if the owner of the site just wants to move on and become as independent of this as he can.

Let's hope that 'pointing to Google Maps' versus 'pointing to OpenStreetMap' won't trigger another penalty, because then he might lose what is still there, or a good-sized fraction of it.


I'm starting to wonder if killing the "content farms" like About and Answers and the like isn't a case where the cure is worse than the disease.

Often when you searched for a "how do I" question, the content farms were the only things that produced a result that was even vaguely topical, at least before Panda.

I mean, it usually went content farms -> pure SEO spam -> actual relevant results. Content farms weren't good, but they still beat the SEO spammers. At least the content farms had some minimal connection to the topic, unlike, say, eBay.


I actually liked sites like eHow, but I'm probably in the minority here. If I wanted a quick answer to a question, these sites provided a good, quick answer. They don't go into a lot of depth, of course, but if I want that, I just use more specific search terms.


They had good answers?

Everything I ever saw on a content farm looked like it was written by somebody who knew nothing about the topic, but did a quick Google search and extracted some semi-random bullet points from the first few things they saw. Basically, I could get the same value by clicking on the first non-content-farm link and then skimming.


The quality varies. Sometimes the content is good.


So now I am supposed to search on eHow directly? Oh noes.


I think the point is that there are multiple sites like eHow, and searching each of them individually and then deciding which results were the best is sub-optimal. Plus, it prevents the serendipitous discovery of new sites like eHow.

If only there were a way to search multiple content farm sites at once, and rank those results according to how useful they were...


Google already does this, by its measure of "useful" (to you).

If you don't like it, you can search those sites directly. For example, search for "site:ehow.com how to ask a girl out".

Was that so hard?


> Google already does this, by its measure of "useful" (to you).

The point was that they've been penalized more aggressively, to the point where many of these sites are no longer showing up at all.

> If you don't like it, you can search those sites directly. For example, search for "site:ehow.com how to ask a girl out". Was that so hard?

That defeats the primary purpose of using a general search engine - namely to get ranked results across a large number of sites.


But it is still serving its purpose. You can just tell it what you want to see.

I wish there was a way in Google to indicate you'd like to see results from a site ranked higher FOR YOU. I imagine they already learn that when you click on that domain in two separate searches.


It's interesting that About gets cited as a terribad content farm, but I actually find they have a bunch of useful stuff for me while I'm learning French - a handy reminder of the rules for forming adverbs was one I used this week, for example.


> turkey voting for Christmas

That was a new phrase for me. It seems to be chiefly British, but the Wikipedia page suggests that the American alternative is "turkey voting for Thanksgiving." I searched Google Ngrams for "turkey[s] voting for christmas|thanksgiving" and it seems that the British variant is a relatively recent phenomenon. Ngrams has no results for the American variants. Is this a common phrase in the UK?

Google NGrams result: https://books.google.com/ngrams/graph?content=turkey+voting+...


From my life in Britain the phrase always brings to mind its usage in Parliament in 1979 (http://en.wikipedia.org/wiki/1979_vote_of_no_confidence_in_t...) which seems to have been the first time many of us encountered it.


It comes from Callaghan's speech (as quoted in the article):

"We can truly say that once the Leader of the Opposition discovered what the Liberals and the SNP would do, she found the courage of their convictions. So, tonight, the Conservative Party, which wants the Act repealed and opposes even devolution, will march through the Lobby with the SNP, which wants independence for Scotland, and with the Liberals, who want to keep the Act. What a massive display of unsullied principle! The minority parties have walked into a trap. If they win, there will be a general election. I am told that the current joke going around the House is that it is the first time in recorded history that turkeys have been known to vote for an early Christmas."


That explanation fits well with the n-grams results. Thank you.


The plural of anecdote is not data, but as a Brit, I am familiar with the phrase.


From the post:

"Tomorrow I'm going to feature the very last Google Maps on Google Maps Mania,

The blog now gets 10% of the Google search traffic it did just 18 months ago. With Google attempting to kill off Google Maps Mania it would be like a turkey voting for Christmas for me to continue to promote Google Maps and the Google Maps API.

Last week I came very close to giving up completely. But despite Google I still think there is an audience for the blog. So from Wednesday Google Maps Mania will be featuring maps created with Open Street Map, Map Box, Leaflet and other map providers.

If you have any Google Maps you want promoting you have about 24 hours left to submit them to Google Maps Mania."

----

The blog is at: http://googlemapsmania.blogspot.co.uk/


I am curious why you pasted the entire content of the linked page as a comment. Is plus.google.com frequently blocked and/or unreachable?


Yes, Google+ is usually blocked on corporate networks. Which is really no big loss.


I think you used the word "usually" to mean "in my narrow and by no means representative experience."


I mean blocked on most major networks by Barracuda and WebSense by default. So, I'll stick with my original statement, thanks.


I think they were more referring to the snarky "no big loss" comment.


I was unsure of the visibility of the page, and unsure whether it could be deleted or edited before we had time to discuss it. It was also just a paragraph of text.



I seriously thought from the title "That must be because they've come to hate new Google Maps so much" and was surprised that wasn't the underlying reason.

To be honest, I've always been dubious of any business that is familiar enough with Google's algorithm updates to refer to them by name. It's indicative of being way too dependent on a single entity.


It's not necessarily indicative of being "dependent" on Google organic traffic. Even if organic search drives just 10% of your revenue, that 10% may still be big enough to warrant plenty of attention on the organic search channel. Enough to warrant paying attention to algorithm changes and versions.

I'll never disagree with the idea that you should diversify your marketing and not become dependent on a single channel, especially a channel you cannot control, but to ignore or neglect that channel when it actually matters is not a better or noble solution.

So far, my experience with the new Panda update is that it's actually correcting for some sites it probably shouldn't have penalized in previous Panda iterations.


Publishers are extremely dependent on Google. Traffic referred from Google has a much higher ad CPM than direct traffic. Mainly because transient traffic is much more likely to click on an ad to go elsewhere than your actual users.


It seems like a blog specifically about one company would by definition be dependent on it, no?


OT, but does anyone else have wild zooming issues with the "new" Google Maps on OS X with a scroll-wheel mouse? It borders on unusable as the zoom seems to completely have a mind of its own.

Edit: This seems to be the issue I'm experiencing: https://productforums.google.com/forum/#!topic/maps/6D072fsK...


I have that issue too. It seems to depend on the mouse. It doesn't happen with my Logitech mouse as long as I set the wheel to "clicky" mode.


If you have a mouse with detents, there seems to be no problem.


This is a preview of our AI dominated future. The AI will penalize you and you will have no idea why and have no recourse or appeal. It's also great for plausible deniability if it was indeed a "manual action".


It's a hint at how future search engines will diversify. Eventually Google will become more like the yellow pages where you kind of already know where everything is, you just need its database to get the last mile.

Google will supply the commercial results, but not all of the world's information - only its most profitable information. If you want to know how to do something, how something works, you'll use a different, non-google indexer for searching, say, all of the "how to" sites: e.g. wikipedia, stack exchange, about.com, etc.


> If you want to know how to do something, how something works, you'll use a different, non-google indexer for searching, say, all of the "how to" sites: e.g. wikipedia, stack exchange, about.com, etc.

I've gradually started to do this using DuckDuckGo's bangs [1]. It actually works pretty well if I know exactly which site I want to search. I do miss Google's ability to filter by time, though.

[1] https://duckduckgo.com/bang.html
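
For anyone who hasn't tried them, a bang is just a prefix in the query box; a few real ones:

    !w openstreetmap      searches Wikipedia
    !so maps api zoom     searches Stack Overflow
    !g turkey christmas   falls through to a Google search

The full list is on the page linked above.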


Hail there, fellow out-of-liner in sub-zero downvote land! I guess by not toeing the party line about Google, we're getting downvoted.

Nevertheless, I wanted to take the time to thank you for the information about DDG's bangs. I've been using DDG since Snowden, but didn't know about bangs! Looks very useful, and shorter than typing "site:" in Google. It kind of reminds me of multireddits.


As a (very) long-time user of the Internet, I've become increasingly disgusted at what it's become, in particular the rise of the whole "SEO phenomenon" that's basically become a cat-and-mouse game between site owners and search engines. Large corporations and SEO spammers have the resources to play this game, while the smaller independent sites (often managed by a single person) don't, and when they get penalised somehow, it's much harder for them to solve the "problem". The often interesting, unique, and more personal bits of the Internet are being made less accessible.

I think the reason it became this way is entirely due to search engines encouraging this ranking game. Those who want their site to be at the top of the SERPs optimise for that, instead of focusing on what makes their sites' content valuable to actual users. The incentives are all wrong, from the perspective of what the Internet should be. To fix this, I think search engines should, instead of trying to develop more and more complex algorithms that the SEOs will just figure out how to beat, use a (periodically changing) random ranking. This would probably kill off SEO completely, as there would no longer be anything about the search engine to optimise for - everyone who wants visitors would instead focus on their content, so that on those occasions when their site appears near the top of the rankings, they can attract and retain users who will hopefully bookmark them and visit again.
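
To make the idea concrete, here's a minimal sketch of what "periodically changing random ranking" could mean mechanically (my interpretation, not any real engine's): sort the matching results by a hash of (URL, current period), so the order is stable within each period, reshuffles when the period rolls over, and gives an SEO nothing to optimize against:

    import { createHash } from "node:crypto";

    // Deterministic per-period sort key: same URL + same period = same key,
    // so results are stable for a week, then reshuffle.
    function rankKey(url: string, periodDays = 7): string {
      const period = Math.floor(Date.now() / (periodDays * 86_400_000));
      return createHash("sha256").update(`${url}:${period}`).digest("hex");
    }

    // Rank only the results that actually match the query, in "random" order.
    function rankRandomly(matchingUrls: string[]): string[] {
      return [...matchingUrls].sort((a, b) => rankKey(a).localeCompare(rankKey(b)));
    }

One nice property: the shuffle needs no stored state, so it's cheap, and nothing about a page's content or link profile can move it up the list.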


> I've become increasingly disgusted at what it's become, in particular the rise of the whole "SEO phenomenon" that's basically become a cat-and-mouse game between site owners and search engines. Large corporations and SEO spammers have the resources to play this game, while the smaller independent site

It's essentially a replay of what killed Usenet. And the "SEO experts" are every bit the vile blight on the web as spammers were on Usenet.


The "SEO phenomenon" has been going on for well over two decades now.

And if you knew how to search, it was a nuisance, but could be dealt with.

The (relatively) "new" part of it is Google throwing out the babies with the bathwater.


A random ranking? Of all results? Are you kidding? The vast, vast majority of results for a typical query are tangentially relevant at best.


I don't mean all results that may or may not even have the search terms (which is what Google "helpfully" tries to do), but all results that actually have the given search terms. Hopefully this will mean users learn to use more specific search queries to get what they want, and thus ask more specific questions --- not a bad thing at all for the Internet and society in general either.

The biggest advantage of a random ranking IMHO is that it completely eliminates any incentive to game the ranking system, and gives the "indie" portion of the Web a fairer chance at getting visitors. Let the users decide themselves individually what's relevant to them, and not Uncle Google.


This site was due for a revamp anyway. Nowadays some of the most exciting maps are created with OpenStreetMap -- and the blog was featuring most of them anyway. The Google Maps name was not giving OpenStreetMap its due credit.


How is Google trying to kill off Google Maps Mania?


The scourge of low quality content has Google putting more and more pest control into their algorithm. That has started killing off the traffic to actual quality sites.


Most notably, MetaFilter[1] has taken a really hard hit; the site had been funded by a $5 fee to join (post/comment) and Google ad revenue on AskMetaFilter[2]. Now they've had to lay off several moderators who contributed to the site's high quality.

[1] http://metafilter.com [2] http://ask.metafilter.com


From the comments: it's Google search's 'Panda' updates. More info on Panda here: http://searchengineland.com/metafilter-penalized-google-1921...


While this makes a certain amount of sense (not promoting a tool from a company that is not promoting your content as well in another arena), I'm actually glad that Google works in a blanket manner rather than an individual manner. Assessing "manual" promotions of domains could get really sticky really quickly.


What makes you confident that Google doesn't manually adjust weighting and ranking for individual sites? I've always suspected they might -- both increasing and decreasing ranking -- although they won't want this to be well-known for legal and other reasons.

The legal reasons _might_ be enough to keep them from doing it at all, it's true, but I've always suspected they do it anyway.

I mean, we know they do it to penalize certain sites they believe violated their ad rules, right? So we know their systems at least possess the technical ability to manually penalize sites.

But I'm not an expert in this stuff, which is why I start with a question (it wasn't rhetorical!).


Google does engage in manual actions, but such actions are always reported in Webmaster Tools.


And this is why you shouldn't rely on Google for your traffic, but diversify off of it ASAP. In fact, don't rely on any one company. Use the open web.


Could you give some examples of alternative ways to get Google quality traffic that fall under "the open web"?


Exactly. I want to be edified as well. I have not found another way to get organic traffic.


Develop content that people would share with one another and find on other sites.

Videos - develop videos that drive "traffic" to your website ... Vimeo, YouTube, 5min, or your own site

Blogs - have people in your company start writing blogs and sharing their expertise on the web, and others will naturally want to check out your products ... LinkedIn, Blogger, or your own site

PR - have people cultivate relationships in forums and with publications, and post the stuff that your bloggers write

There are many ways to get traffic besides relying on one search engine.

And by the way, even ON that search engine, you should be creating all kinds of things to "take over" the first page, if you are really trying to vie for traffic.

e.g. on GOOGLE:

YouTube videos will show up if you make them

Google+ profiles will show up if you make them

local listings, images, maps, bla bla bla

Which means Google is telling you to upload a lot of content to GOOGLE-owned properties in order to promote your site.

So if you really want Google traffic that badly, you should do it!


https://en.wikipedia.org/wiki/Organic_search

So what you say might answer the first question, but not the follow-up one. Well, your "e.g. on GOOGLE:" list answers it, but the fact that you use Google for that example already shows that there really isn't any way around it. He wasn't asking how to improve ranking; he was asking for alternatives to Google search traffic. (Unless I got that wrong.)


But I just told you: social, LinkedIn, Vimeo, YouTube, other things. Sure, you don't get the "instant win" of having a big player link to you, but you are also not so dependent on ONLY that big player.

It's kind of like the difference between being discovered by an A&R department of a record label or going the indie route like Ben Haggerty.


But people won't discover my blog posts and videos without Google.

Or do you suppose spamming my Facebook feed promoting my work website is the way to go?


How do you go from "people won't discover my own site under Google's new algorithm" to "no one on LinkedIn and other places will ever discover any articles linking to my site without Google"?


We do post things on LinkedIn and Facebook; however, we get only a tiny fraction of the traffic that we get from Google. In addition, the traffic from Google seems to convert to paying customers at a higher rate--not sure why yet.


Can you define Google quality traffic exactly?


In a generic sense, traffic that finds your page relevant (i.e. not someone searching for shoes and ending up on this HN thread).


And how do most people get to the open web? Google. By a huge margin. I see numbers ranging from 66% to 90%, depending on how you measure it.


I remember the open web. It was great. You'd get into a WebRing and just keep going.


I think it is healthy to cover all map providers anyway. I had a rude awakening too a while back when I wanted to do a native Android hack on Google Glass using maps, but none of the usual MapActivity, etc. is available there. So OpenStreetMap was the way to go. Google Maps sometimes just isn't the solution, so it is good to know many.


I monitor some keywords that relate to obscure topics of interest of mine. If I look at search hits from the last 24 hours, I get lots of pages of Markov-chain-generated spam that looks like forums or blogs or community sites. It's freakin' ridiculous. The spammers are using machine learning to fool Google's machine learning.
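
For anyone who hasn't seen it, this kind of spam is trivially cheap to produce; a toy bigram generator is only a few lines (a sketch, not any spammer's actual code):

    // Toy bigram Markov chain: map each word to the words seen after it,
    // then take a random walk. Output is locally plausible, globally nonsense.
    function buildChain(corpus: string): Map<string, string[]> {
      const words = corpus.split(/\s+/);
      const chain = new Map<string, string[]>();
      for (let i = 0; i < words.length - 1; i++) {
        const followers = chain.get(words[i]) ?? [];
        followers.push(words[i + 1]);
        chain.set(words[i], followers);
      }
      return chain;
    }

    function generate(chain: Map<string, string[]>, start: string, length = 30): string {
      const out = [start];
      let word = start;
      for (let i = 0; i < length; i++) {
        const followers = chain.get(word);
        if (!followers) break;
        word = followers[Math.floor(Math.random() * followers.length)];
        out.push(word);
      }
      return out.join(" ");
    }

Feed it a few scraped forum threads and the output passes a glance test, which is presumably why it keeps leaking into the last-24-hours results.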


The only serious traffic I got to my website RoadPetition.com came from when it was featured on Google Maps Mania.

Does anyone here have any feedback for me on this Google Maps Mashup?

http://www.roadpetition.com/


You should probably migrate to the V3 API.
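
For what it's worth, the v3 version of a basic map is small. A rough sketch (the element id, coordinates, and the declare shim are placeholders; the v3 loader script is assumed to already be on the page):

    // Assumes <script src="https://maps.googleapis.com/maps/api/js"></script>
    // is already loaded; the declare line just keeps TypeScript quiet
    // without the official type definitions.
    declare const google: any;

    function initMap(): void {
      const center = { lat: 51.5074, lng: -0.1278 };
      const map = new google.maps.Map(document.getElementById("map"), {
        center,
        zoom: 12,
      });
      new google.maps.Marker({ position: center, map, title: "Example marker" });
    }

google.maps.Map and google.maps.Marker are the real v3 entry points; everything else above is illustrative.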


To top it all off, Google Maps Mania runs on Blogspot (a Google property).


As does an awful lot of spam.


I've owned my own web development company for 10 years. In all this time, I've never had a problem with any of my clients' sites scoring well in Google search. They have never lost their rank, never been de-listed, never "disappeared", never felt Google was out to get them or that Google was trying to kill them off.

Google has always been helpful, given us free tools, told us what to do and how to do it, what is good and what is bad and how to do good by them.

Then you run into people like this guy.

I must be doing something wrong.


I'm sure the Maps team will be excited to hear about this. On the other hand I wonder what kind of sites this website is competing with. I don't think the site is really best suited to the blog format; maybe it'd be a good idea to rethink the presentation and throw some of the latest SEO strategy in there as well. You can't just make a blog these days and do ordinary blog things and expect to collect the same SERP pension every year.



