Google Doesn't Want to Lead You Down Any Dead Ends (41latitude.com)
93 points by joao on Jan 4, 2011 | hide | past | favorite | 36 comments



The maps observation is interesting, but I read the headline in the context of Google search. Do any SEO experts know whether Google search also penalizes 'dead-end' websites (ones with no outbound links)?


It's a tricky thing to do, as the potential for link spam is very high. All one needs to do to rank a site highly is link to some authoritative/reputable site. Edit: One apparent fix is to assign smaller scores to outlinks than to inlinks. Even this is easy to game by setting up numerous pages to harvest outlink score and then directing it at a page as an inlink.
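The "smaller scores for outlinks" fix mentioned above could be sketched roughly like this. The 0.1 weight and all page names are invented for illustration; this is not any search engine's actual formula:

```python
# Toy link score: inlinks count at full weight, outlinks at a much
# smaller (assumed) weight, so linking out to reputable sites earns
# only a token bonus.

OUTLINK_WEIGHT = 0.1  # assumption: outlinks are worth far less than inlinks

def link_score(page, links):
    """links: dict mapping page -> set of pages it links to."""
    inlinks = sum(1 for src, targets in links.items() if page in targets)
    outlinks = len(links.get(page, ()))
    return inlinks + OUTLINK_WEIGHT * outlinks

links = {
    "spammer": {"bbc.com", "wikipedia.org"},  # links out to reputable sites
    "fan1": {"wikipedia.org"},
    "fan2": {"wikipedia.org"},
}
# The spammer harvests only a small outlink bonus (0.2), while the
# genuinely linked-to page earns full inlink credit (3.0).
```

As the comment notes, even this damping can be gamed by farming many outlink-bonus pages and funneling them back as inlinks.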

Edit: Erm, anything wrong with my comment? It got voted down to -1.

As a graduate student, I look into algorithms like these for ranking web pages.

This was with reference to dead-end websites and awarding PageRank-like scores based on what my site points to (in addition to those that point to me).


OT: this seems like one of the most bizarre things I've seen on HN. For some reason my comment and the reply seem to have rubbed some people the wrong way; the score has been cycling between -1 and 4.

Not complaining about the downvotes (and certainly not about the upvotes :), just that it's quite weird, and unlike anything I've experienced here on HN.


There's a difference between specifically penalising dead-end sites (or sites that only send linkjuice internally or to "low quality" sites) and scoring sites positively by number or quality of outbound links.


Yeah, sure. I was commenting particularly about awarding scores based on outlinks.

I got downvoted, I think, because my comment was about web pages and had become visually separated from the other comments about ranking web pages. So I added a note that my comment was about the discussion of web pages that had ensued.


Yes, or more precisely, good outbound links count in your favor. I've never heard of a site being "penalized" for not having links, but I'm sure having good links is good for a website (and likewise, having bad links is bad for it).


At one time, I thought that outbound links actually drained your PageRank juice. I also seem to recall IBM research coming up with an algorithm that ranked sites as either authorities or hubs, and gave a different kind of positive ranking to sites that did a good job of being hubs for a given search term or terms.


That would be Kleinberg's HITS algorithm.

http://en.wikipedia.org/wiki/HITS_algorithm


The HITS algorithm you're talking about isn't used much because it's quite easy to game the rankings. There are some partial fixes. If I remember correctly, the search engine Teoma was using a variant of HITS.
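For reference, here is a minimal power-iteration sketch of Kleinberg's HITS on a toy link graph. The page names are invented, and real uses run HITS on a query-focused subgraph rather than the whole web:

```python
# Minimal HITS (hubs and authorities) via power iteration.

def hits(graph, iterations=50):
    """graph: dict mapping page -> set of pages it links to."""
    pages = set(graph) | {p for targets in graph.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page's authority is the sum of the hub scores of pages linking to it.
        auth = {p: sum(hub[q] for q in pages if p in graph.get(q, ()))
                for p in pages}
        # A page's hub score is the sum of the authorities it links to.
        hub = {p: sum(auth[q] for q in graph.get(p, ())) for p in pages}
        # Normalize so the scores stay bounded.
        for d in (auth, hub):
            norm = sum(v * v for v in d.values()) ** 0.5 or 1.0
            for p in d:
                d[p] /= norm
    return hub, auth

links = {
    "hub1": {"authA", "authB"},
    "hub2": {"authA", "authB"},
}
hub, auth = hits(links)
```

The gaming problem is visible even in this sketch: anyone can mint a high hub score simply by linking out to known authorities, which is exactly the spam vector described above.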


I honestly can't say that I have ever seen evidence regarding the value of outgoing links. Also, what counts as a good outbound link?


I believe my portfolio page is evidence of this; I can't think of any other reason it would have such a high PageRank (4). I don't think there are any offsite links pointing to it. (It has the same PR as my homepage and service page, both of which have numerous incoming links.)


If your homepage and your portfolio page are on the same site and in the same URL directory, then it could be an artifact of the block-rank approximation to PageRank.

Since PageRank is an expensive computation, one effective optimization/approximation is to cluster pages into groups, compute a PageRank per group, and then redistribute the group scores.

I am not sure whether this is what is going on, but it is a possibility.
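As a rough illustration of the grouping step (not Google's actual clustering), pages could be bucketed by host plus top-level path segment before any per-block score is computed. The URLs below are placeholders:

```python
# Group URLs into "blocks" by host and first path segment, the kind of
# clustering a block-structure approximation to PageRank might use.

from urllib.parse import urlparse

def block_of(url):
    """Return a (host, top-level directory) key for a URL."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    top = segments[0] if segments else ""
    return (parts.netloc, top)

urls = [
    "http://example.com/portfolio",
    "http://example.com/portfolio/project1",
    "http://example.com/blog/",
]
blocks = {}
for u in urls:
    blocks.setdefault(block_of(u), []).append(u)
# Pages in the same block would inherit roughly the same block score,
# which could explain several pages sharing one toolbar PageRank.
```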


You might be right. Checking now, every page I have at that top-level directory is either not ranked (effectively 0) or also a 4.

Edit: just found one that's a 3, but it has a trailing slash while the others don't (so it might be considered a directory and computed separately).

It also just occurred to me that my XML sitemap assigns a priority to each page, and the ones with a PR of 4 are the ones with a priority of 0.8 or higher. There's another top-level page with a 0.5, but it has a trailing slash, and it's a WordPress blog, so it has its own sitemap. So I think that 4 might be independently calculated.


+1 Thanks for digging into this. It's something I have an interest in.


My pleasure.

Not trying to toot my own horn, but you might find the bookmarklet on this page useful; it certainly made things much easier for me: http://nfriedly.com/pagerank


Oh nice! For some reason I thought Google exposed PageRank only through their toolbar and that one needed to be logged in.


Yeah... Google thinks that too ;)


Is there a reason you're sure that outlinks are rewarded? I haven't encountered anything that demonstrates that, so I really would like to know.


That's all very well, but how do you explain Castle Ridge Rd? (last map) There are also some loops that have the same weight as dead ends.

Alternative explanation: the municipality surveyors assign prominence-values to roads, based on width etc, which happen to (often) coincide with dead ends being lighter. Google uses these values; competitors don't.

I agree it is a nice effect, both useful and beautiful.


Castle Ridge Rd, while a dead end, is also a trunk with several branches. It seems like only dead ends that lead nowhere else get this width.


This makes sense. "Dead end" is probably defined by whether a road connects to any other roads. They wouldn't want to define dead-end as whether it connects to any _non-dead-end_ roads.
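Under that definition, dead-end detection reduces to a degree check on the road graph. The graph below and the endpoint representation are assumptions for illustration, not how Google actually stores roads:

```python
# Mark a street as a dead end if one of its endpoint intersections
# touches no other street.

def dead_end_streets(streets):
    """streets: dict mapping street name -> (endpoint_a, endpoint_b)."""
    # Count how many streets meet at each intersection.
    degree = {}
    for a, b in streets.values():
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    # A street is a dead end if it alone touches one of its endpoints.
    return {name for name, (a, b) in streets.items()
            if degree[a] == 1 or degree[b] == 1}

roads = {
    "Main St": ("A", "B"),
    "Elm St": ("A", "C"),
    "Oak Ave": ("B", "C"),
    "Cul-de-sac Ct": ("B", "D"),  # "D" touches nothing else
}
```

Note that a trunk like Castle Ridge Rd would escape this check: both of its endpoints touch branch streets, so it is not marked even though the whole subtree is ultimately a dead end.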


Obviously, the emphasis is based on the _length_ of the road, data that is easily accessible. Castle Ridge Rd. is long. Given data laying out connected points, it's harder to say what's an intersection, but easy to say how far apart the points are.


Cool, but I'm not convinced that the deciding factor is whether the street is "dead end" or not vs. it just being a really minor street. There's at least one dead end street in the examples that is pretty thick: https://skitch.com/kadavy/r8iie/41latitude-google-maps-doesn...


That street ends in a cul-de-sac. That's not really a dead-end.


There are plenty of other streets on that map that end the same way, but are thinner.


After examining the streets in the town I grew up in (Overland Park, KS), I think it is more likely that you found an error.

Whatever kind of algorithm they are using here, I'm quite impressed by it.


I don't know... 99% of the really thin streets in the examples do seem to be "dead end" streets. I think you just found an error (which wouldn't surprise me, given how many minor errors I've found on Google Maps in the past year).


This is quite interesting; never thought of giving map readers a visual clue to whether a street is a throughway or not...

But this is just for visual representation, and I think the shading is a little too drastic. There were some examples of some pretty long streets that were almost invisible.

Another interesting catch by a very interesting website!


Take a look at London and you'll see it's not only dead-ends that get this treatment. I think there's more going on here… the roads I see showing up less prominently here are the kind that are difficult to drive down (I've been looking around my neighborhood, so speaking from experience)… however they're actually calculating this prominence, it's damned clever.

Edit: in fact, it can even be segments of roads. See https://img.skitch.com/20110104-d3k9f76gttjqbtsg94m85qb3nh.j... (that's all the same street if you zoom in)


The dead-end streets in my neighborhood are physically narrower, too, probably for the same reason. As a side effect, though, it's hard for delivery trucks and snow plows to turn around and get back on the main road.


I imagine they're smaller because they see significantly less traffic.


Interesting, but I'm not sure this is the whole story.

Google Maps's stroke width for roads seems to be based on some measure of traffic or connectedness. In my experience, roads that see fewer cars and fewer intersections show up less prominently. Since Google has vector data for maps, my guess is that they algorithmically weight and display routes, and this leads to dead ends being fainter than other roads.
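If connectedness really is the signal, a crude rendering rule might scale stroke width by how many intersections a road passes through. The multiplier and cap below are pure guesses, purely to make the idea concrete:

```python
# Speculative stroke-width rule: more intersections -> wider stroke,
# capped so arterials don't grow without bound.

def stroke_width(intersection_count, base=1.0):
    """Width grows with connectedness, capped at 4x the base width."""
    return base * min(1 + intersection_count * 0.5, 4.0)

# A cul-de-sac touching one intersection stays near the base width,
# while a well-connected through road renders several times wider.
```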


They could actually weight it based on the number of GPS "check-ins" a road sees.


He's assuming this was an aesthetic choice. It's entirely possible they weighted dead ends lower in the algorithm and the tiles just came out like that when rendered.


What’s the difference?


I read the headline with the context of business models, and businesses that grow to rely on Google, and then, one day, find out: uh-oh.



