How Google Evaluates Links
Google recently announced “40 changes for February”, but the big one from an SEO’s point of view was the link evaluation change:
Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.
If this didn’t send a small giggle or a large shudder through the SEO industry, I’m not sure what will…
Links have been the cornerstone of Google ever since Larry Page came up with PageRank, but links have also been the easiest part of the algorithm to game, and that’s got to end sometime 🙂
So let’s try to break down what Google said: “We often use characteristics of links to help us figure out the topic of a linked page”.
What are the characteristics of links?
Anchor text
Well, it does have the feeling that anchor text would be the prime suspect to be discounted towards the topic of a page. I guess that would also put a stop to Google bowling and Google bombing with terms like “terrorist sympathizer”.

Or could it be other characteristics?
Pagerank
Google’s link scoring algorithm, named after Larry Page and used by Google forever. It assigns a numerical value at page level, calculated from how many links Google can find pointing to the linked page (and how well those linking pages score themselves). The quick fix for this would be to just turn off Toolbar PageRank.
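To make the mechanics concrete, here’s a minimal sketch of the classic PageRank power iteration in Python – a toy graph and the usual 0.85 damping factor, purely illustrative and nothing like Google’s production scoring:

# Minimal PageRank power iteration over a toy link graph.
# Illustrative only - Google's real scoring is far more involved.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to its outlinks; every page must appear as a key."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = rank[page] / len(outlinks)  # each outlink passes an equal share
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Toy graph: the more (and better-scored) pages link to you, the higher you score.
graph = {"home": ["blog", "tools"], "blog": ["home"], "tools": ["home", "blog"]}
print(pagerank(graph))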
Content Proximity
Was it ever a bigger factor? In-content links have always felt more weighty and natural.
As an example: David Naylor, a well known and respected SEO, gave a very informative speech at the recent SES Search conference with other leading SEOs Richard Baxter of SeoGadget and Neil Walker of Just Search. There is a good round-up here.
The text surrounding “respected SEO” and “here” relates to SES (www.searchenginestrategies.com), an SEO-related conference, and it sits in close proximity to two other SEO companies, so it could be said that the anchor “here” could be assigned the same anchor-text-related values as “respected SEO”.
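As a rough sketch of the idea in code – pull the paragraph around each link and treat it as extra context, much like anchor text. This assumes the BeautifulSoup library and is only a toy, not how Google does it:

from bs4 import BeautifulSoup

html = """<p>David Naylor a well known and respected
<a href="http://www.davidnaylor.co.uk">SEO</a> gave a speech at SES with
other leading SEOs. There is a good round up
<a href="http://www.searchenginestrategies.com">here</a>.</p>"""

soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a"):
    # Use the enclosing paragraph's text as context for the link.
    context = a.find_parent("p").get_text(" ", strip=True)
    print(a.get_text(), "->", a["href"])
    print("  context:", context)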
Page Proximity
Where on the page the link is actually placed
If you take my site, it’s easy to see the structure we opted for:
1) Header : site navigation links
2) Side links : to external sites, Bronco and Becky Naylor
3) Content : internal links
4) Side bar : recent blog links
5) Promo area : links to my tools
6) Footer : call to action links and repeat header links
Now my site is clean and doesn’t sell links, but if I did, the promo area (5) would be the first place for a nice juicy one-off link, then the sidebar (4) and footer (6) areas would be the next places for site-wides – all easily detectable by Google, and most probably already detected 😉
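Just how detectable? Here’s a hypothetical sketch: walk up from each link to the nearest recognisable page section and weight it accordingly. The section names and weights are invented for illustration, and BeautifulSoup is assumed again:

from bs4 import BeautifulSoup

# Invented weights - content links count most, promo/footer links least.
SECTION_WEIGHTS = {"content": 1.0, "header": 0.5, "sidebar": 0.3, "promo": 0.2, "footer": 0.1}

def placement_weight(anchor):
    # Walk up the ancestors until we hit a known section id.
    for ancestor in anchor.parents:
        if ancestor.name and ancestor.get("id") in SECTION_WEIGHTS:
            section = ancestor.get("id")
            return section, SECTION_WEIGHTS[section]
    return "unknown", 0.5

html = """<div id="header"><a href="/about">About</a></div>
<div id="content"><a href="/tools/">my tools</a></div>
<div id="footer"><a href="http://example.com/">sitewide link</a></div>"""

soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a"):
    print(a.get_text(), placement_weight(a))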
Nofollow
I just can’t feel this one at all. Nofollow was brought in by the major search engines to fight link spam; I can’t see them removing it for everyone now.
Link attributes: Title, Type and Rel
Again, not a strong signal, but it could carry a slight score. With Google pushing rel=author, though, it would seem unlikely that they would stop looking at the attributes:
<a href="album.html">My Pictures</a>
<a href="../images/me.jpg">Picture of me</a>
<a accesskey="A" type="audio/midi" href="/sounds/singing.mid">Dave singing (MIDI)</a>
<a title="How Google evaluates links – part 2" href="section2.html" rel="next">part 2</a>
<a title="Feedback on my blog" href="mailto:me@davidnaylor.co.uk">me@davidnaylor.co.uk</a>
<a href="http://www.bronco.co.uk/">Web Design</a>
<a href="index.html.de" hreflang="de">German version</a>
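For completeness, pulling those attributes out of a page is trivial – a quick sketch with BeautifulSoup (note that bs4 returns rel as a list, since it’s a multi-valued attribute):

from bs4 import BeautifulSoup

html = '<a title="How Google evaluates links - part 2" href="section2.html" rel="next">part 2</a>'
soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a"):
    # Collect the secondary attributes discussed above, where present.
    attrs = {name: a.get(name) for name in ("title", "type", "rel", "hreflang") if a.get(name)}
    print(a.get_text(), attrs)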
But the big thing here is “we are turning off a method of link analysis that we used for several years”, not “we have turned off a method of link analysis that we used for several years”. Whatever Google are doing, I don’t think it’s rolled out just yet!
Update:
After thinking a little more on this last night, I wanted to add some more ideas on what types of links Google could be about to start discounting…
Affiliate links: this would make sense. I know we have seen a dampening over the years, but maybe a straight-out cull is in the offing. This would affect some big players like Amazon and eBay, I guess. If you check out the visibility of Amazon today in the US…

Not sure if this is the case or something totally different that Amazon has done.
301 redirected pages
This is where a page of content is pushed out onto the web and links are built into that page, only for it to be 301'd into another site. Again, this plays out a little in the affiliate-link scenario, but it's maybe more of an attack on SEOs than affiliates. I have seen this method work well when you 301 the content into a merchant with your affiliate ID (just saying).
Or what about links into your old pages that you 301 into other pages of your site?
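Spotting this pattern from a crawler's side is straightforward – a minimal sketch using the requests library, with placeholder URLs:

import requests
from urllib.parse import urlparse

def check_redirect(url):
    # Fetch without following redirects so we can inspect the 301 itself.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code == 301:
        target = resp.headers.get("Location", "")
        cross_domain = urlparse(url).netloc != urlparse(target).netloc
        print(url, "->", target, "(permanent redirect, cross-domain:", cross_domain, ")")
    else:
        print(url, "status", resp.status_code, "- no permanent redirect")

check_redirect("http://example.com/old-page")  # placeholder URL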
Short URLs
Can these really be trusted? I have seen the old flip-flop on these:
Short URL – pointing at a NEWS story
Then the news article is pushed out and everyone starts using the short URL.
Short URL – then swapped to point at a fake news site
Age of links
This keeps coming up. If a link has been on the web forever, should the equity keep on building, or should it have a shelf life? I always felt that Google could have easily looked at links in a six-month window only, though I feel this plays into the hands of the link builders. Age of links also plays into the 301-redirected target page; as you see, the more we think about this, the more it twists and turns…
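If Google did give link equity a shelf life, the obvious mechanical form is time decay. A hypothetical sketch – the 180-day half-life is plucked out of thin air, not anything Google has said:

import math

def aged_link_value(base_value, age_days, half_life_days=180):
    # Equity halves every half_life_days - purely hypothetical numbers.
    return base_value * math.exp(-math.log(2) * age_days / half_life_days)

for age in (0, 90, 180, 365, 730):
    print(age, "days old:", round(aged_link_value(1.0, age), 3))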
My thoughts
And my second-to-final thought… what if Google actually wants to remove the LINK part of the algorithm? Surely they would start off by nibbling at little sections of the LINK analysis part of the algorithm, then test in sectors that we aren't looking at until they are happy and can roll out to the masses. I don't think we have seen this in the UK yet.
Final thought… if it's anchor text this time, the rest will follow, and this is why we need to look at how Panda works and what it stands for. It's a content identifier; it tells Google many things:
a) This content is thin
b) This is affiliate-led content
c) This content is about this subject
d) This content can be trusted
e) People like this content
If the content was all about Logitech, as an example, then surely any links on that URL stub should lead to keyboards, mice and speakers; they shouldn't lead to Russian brides, search engine optimisation or even website design – thereby killing sidebar links, footer links and even dodgy in-content links. But then again, could Panda categorise your site not only into good or bad, but by industry, so that links within your industry count for more than links from outside your sector? Panda could say you're a trusted news site but not trusted for product reviews due to affiliate links. It's endless really, but that's why we love this industry. And if this has been rolled out in the UK… Keep Calm and Carry On.

44 Comments
Steven Holmes
I have a strong feeling it’s to do with your first point – anchor text.
Google has been placing increasing emphasis on quality, frequently generated on-site content over the past year – almost every major update they have implemented points to this. The factoring in of social media signals also plays into the hands of quality content producers.
However, the likelihood of quality content generating an exact anchor text link is very small indeed – the best you’re going to get out of it is a domain/brand link in most cases, but if it’s naturally acquired then it should have more impact than a manually placed exact anchor text link, no?
If they do away with analysing the anchor text of a link, the argument for producing quality content and attracting links the ethical way is strengthened.
That’s my theory anyway.
Dennis Sievers
Hi Dave,
Barry is also mentioning the “age” of the link in his poll as a possible factor that has been phased out.
Gavelect - http://www.eqtr.com/
Hi Dave
I can’t imagine it being anchor text; if that was the case I would expect to see major movement in the SERPs.
It will be interesting to see what you guys come up with over the next few weeks.
Michael Cropper - http://www.michaelcropper.co.uk
Nice and cryptic from Google as usual….
My gut is telling me they will be focusing more towards social mentions (aka links) as opposed to other standard links, as social data is harder to game, especially when you take into account the person/social profile which has provided the link.
Whatever Google are doing here, as long as people are building links for other people I don’t believe this will have any impact on or change in link building. If people are building links for Google then it may need a change in thinking…
Thomas Schmitz - http://schmitzmarketing.com
Age is interesting. In some markets or niches old (very old) reciprocal links still seem to be among the majority of links that top ranking sites have. It’s like they were grandfathered in and those sites never lost their rankings, even those that have not done serious link building or social media since those long past days.
fantomaster - http://fantomaster.com/
With Matt not giving any tangible details (not unexpectedly, of course), I’d venture a guess that this is worded to provoke lots of uncertainty once more in the SEO & link building world. Wouldn’t be the first time, just like retaining the TBPR keeps lots of folks ever so busy second-guessing the big Goo.
Not that I disagree with anything you’ve written here, Dave – it’s just that it could be something far more complex (e.g. a combo of widely divergent factors): yet, how would we really know except for ages of testing & speculating? While I doubt it’s a red herring altogether, even that tactic wouldn’t be beyond them, no?
Dixon Jones
Killing age as a factor might work well.
If they switched off PRank altogether, then the results would have changed so dramatically that this must be unlikely (I think).
Frankly I think NoFollow never did what it was meant to do, but if they switched that off, the results would probably skew a lot and I think they would outright tell us.
Curtis - http://www.curtisnoble.com
I think it’s probably already been turned off. Forums (especially the black-hat types) are buzzing with new threads about rank drops. Many are even saying they’re receiving messages in Webmaster Tools that suggest they have been caught using “link schemes.” If Google announced they implemented 40 new algo updates this month, and only one update had to do with links… I think it’s already in effect. I see steady increases in search traffic since about 2/17. And several others are seeing the notices in Webmaster Central along with traffic decreases around the same date.
Sammy G
If we look back 20 years from now, the whole PageRank link analysis approach to scoring website relevance will look like a childish idea.
A search engine is on the decline when:
– they start tweaking the algorithm daily (Google is doing it now)
– they start tracking click-throughs (Google is doing it now)
– they start selling domains, web hosting, e-mail hosting (Google is doing it now); they do this when they are doubtful the advertising income will be sustained over a period of time.
James Crawford
Anchor text would make sense.
If you get a link from the Guardian or wherever then it’s often authoritative but just a branded link.
Context could be derived elsewhere.
If this is the case then it’s good news for PR people like me.
Clifton Flack (@cliftonflack) - http://www.cliftonflack.com
Good review David.. shame we can’t come up with any workable assumptions 🙂
What’s amazing (or not, depending on your take) is there was NO mention of Google+ as a factor in the updates…
Can we presume G+ is not impacting search updates, or just that Google took such a bashing they’re too scared to explicitly mention it??
One of their updates, “More accurate detection of official pages”, would be the most likely candidate.
Ross - http://www.organic-development.com
I might have to Digg into this, pun intended. I get a feeling at least part of it is to do with which sites it finds authoritative now and which ones it no longer does. Even though links are no-follow on Twitter, they still pass value, almost as if they’re white-listed. Other sites might no longer be in that list.
Yousaf - http://www.elevatelocal.co.uk
Insightful blog post David. Did you read my latest post on Amazon.com being hit by Panda?
Rory - http://www.pizza-diary.co.uk/
Do you guys think it might have anything to do with the source of the link? So, we know that Google likes links from .gov and .edu domains, but this just means they have become more of a target for spammers. Perhaps if Google factors out any authority it would naturally give to these domains, they might become less of a source for black-hattery?
(I’m just an SEO noob, trying to make my way in the world – if my comment is tinged with ignorance or even just plain wrong, I apologise in advance!)
Chris Gedge - http://www.further.co.uk
Very interesting discussion on this here: https://plus.google.com/114074532743058808065/posts/NAquHkxYkaC. Anchor text seems to be disproven near the end of the thread for various reasons. Personally I think it is something smaller that will be harder to notice, such as bolding, surrounding text or placement on page.
Flypark - http://www.flypark.co.uk
They also announced that they updated the Panda algorithm as well – any ideas as to what changes they might have made?
David Naylor
@flypark Not thought about it, tbh.
@chris I would like to agree, but we don’t know if it’s even rolled out fully yet; gut feeling is anchor text, but I think it’s only stage 1.
@rory They would be better off hitting platforms than TLDs.
@yousaf No I haven’t… on my way now 🙂
Carps - http://www.trusteddealers.co.uk
I buy the discounting of affiliate links – it just seems to make perfect sense to me, given that the context of so many of Google’s other changes (targeting thin content, brand weight etc.) seems to be about gunning for affiliates.
I used to work on a site that has ranked top 1-2 for a couple of keywords for years, and the *only* thing supporting that has been a cunningly crafted affiliate/link program. Still sitting there today, but I’ll be watching closely to see if it dips over the next few weeks.
David Naylor
@carps The other thing is: will they start discounting as of now, or backdate it 🙂
ScottB (@TheRealBoydo) - http://www.fusednation.com
What about 301 redirects? They’ve been used for a long time to snatch up expired domains for link equity purposes. And a lot of people are transferring content all over the place as a Panda solution.
Could be Google is removing the 301 redirect’s ability to dictate the subject matter of the page (while still passing a percentage of PR, as is the case just now). It wouldn’t impact site moves, URL changes, etc. – Google would just need to reassess what pages are all about as they index them, rather than relying on 301 redirects to pass any kind of historical information about the page’s theme / subject area.
Given Panda is all about thin content, etc., it’s safe to assume that Google has some serious technology at the ’plex to determine page context, topic, quality, whatever on the fly, so realistically they don’t really need the contextual info provided by 301s (just the IBL info really). Although this would require more processing resources (whereas relying on 301s is a relatively simple solution), it would remove the margin for error that people have been capitalising on for spam / link building purposes.
Google+ authorship, WMT address changes and rel=canonical all perform similar functions but in a higher quality way (i.e. more difficult to manipulate).
There would actually be very minimal impact on legit sites that use 301s (Google can determine the context of content via other methods and PR is still attributed normally) vs sites using 301s to artificially inflate rankings (they would be left to rank on the merits of their content rather than the history of the URLs from another site).
Just speculation, but I did see some strange behaviour with mass 301s implemented in January (though tbh that could be a hundred other things, so I’m not relying on that as proof of anything).
Aidan - http://adsurf.com.au
I’ve been trying to work this out for two days now. The bit that catches my attention is how they start the statement by referring to how links are used to figure out the topic of the target page. This seems to imply that anchor text is the subject of the change, but that sentence can also stand alone in the statement without any connection to the next sentence, as if deliberately put there as doublespeak in Google style.
I’m now favouring it being a tweaking of exact-match anchor text handling, to try to lessen the values passed to exact-match domains…
Chris McGiffen - http://chris.mcgiffen.net
I agree with Aidan re the first sentence, definitely suggests it is about anchor text/context.
Next part also is interesting though: “We have changed the way in which we evaluate links” – that suggests that they have already changed what they consider to be important within the algorithm; the part they are switching off could be superseded by changes already made or such a weak signal as to have minimal effect.
Asif Anwar - http://www.seoppcsmm.com
Hi Dave, thanks for sharing your thoughts about the link evaluation update by Google. I also want to share some of my views.
I also believe that Google does need to rethink 301 redirects. Many black-hatters are using the strategy to increase their PageRank. However, in 2010, Matt Cutts admitted that Google does not flow 100% of link juice through a 301 redirect (source: http://www.marketingpilgrim.com/2010/03/google-confirms-301-redirects-result-in-pagerank-loss.html). But I have also seen people buying PR domains just to increase their PR. I think Google should consider down-scoring 301 redirects from multiple sites and passing the link juice from the best site only.
I still think age of the domain is important for scoring authority. But the link age should be quite fresh – I mean fresh pages on old domains should rank higher.
One of the worst practices used to manipulate rankings in SEO is using ScrapeBox and other black-hat and grey-hat tools to blast backlinks, especially into social bookmarking and forum sites without any authority. Google might think of turning off any link juice from such sites. With Panda, Google has been concentrating more on on-page content quality. But with the link evaluation update, they are concentrating more on scoring sites based on the quality of the links they have, not the quantity.
Another thing you missed is the social signal. Matt Cutts previously mentioned that they have the technology to see the quantity and quality of the social influence as well as the quality of the social influencer. So, in the case of social signals, I think the main thing they want to see is not the quantity of link shares, but the quality of the link shares.
Google has keyword stemming technology and they also have duplicate content identification technology. By combining these technologies, they really can identify which sites have built links with duplicate content and which links were built voluntarily.
Asif Anwar - http://www.seoppcsmm.com
BTW, Google recently made changes to their Privacy Policy. Can anyone think of whether that can help them evaluate links in a better way? Since they can now use AdSense and AdWords data in their algorithm, how do you think it may impact link evaluation?
Martin - http://www.martinduguay.com
Hi Dave,
What do you think about server location? For example, a client’s website is hosted in Canada but almost all links come from websites hosted in the U.K. Bad or not?
Do you think it is better to get links from sites hosted in the same country as the client’s site?
Ken Howard - http://www.simplicatedweb.com
David, these are excellent possibilities. I’m reluctant to believe everything is stated in the monthly updates. I believe Google+ is a much larger ranking factor than has been documented. The way Google is evaluating links has traditionally been through anchor text and the quantity of links pointing at a domain. Now, with G+ they can see shares and conversations all without sending Googlebot out to find these links. They didn’t create Google+ to beat Facebook, they created it to capture data about links. In my opinion.
Ron - http://www.affinitytrack.com
Note to Google:
If you haven’t anything to announce … why announce anything at all?
David Naylor
@martin I think server location plays a very very small part these days
@ken The problem I see is that this post got 214 tweets, 32 Facebook likes and 9 +1s – no real data there.
imnotadoctor - http://www.imnotadoctor.com
I noticed something happening in the last 2 weeks. Sites with really spammy backlink profiles (splogs, blog comments, article directories, paid links) are seeing drops of 5–15 positions, sometimes more, for exact phrases that looked to be over-optimized on anchor text.
This leads me to believe two things.
1. The elimination of anchor text signals
2. The detection of spammy links (not sure what signal) and the removal of them from the link graph.
imnotadoctor - http://www.imnotadoctor.com
Another idea not mentioned:
1. Anchor Text Capping:
According to Google’s patent: “may cap the impact of suspect anchors on the score of the associated document”.
This sounds like Google’s answer to over-optimized anchor text problems. Basically you only get x amount of anchor text credit over a period of time. Reminds me of a movie quote, SEOed: “Choose your anchor text links wisely!”
I wrote about this on my blog post a few weeks back.
Bibliotheken en het Digitale Leven in Februari 2012 | Dee'tjes - pingback
[…] How Google Evaluates Links Research […]
Greg
The change they are referring to is definitely anchor text. And as usual, they aren’t being completely honest about when they started using it. The bottom line is that anchor text has become the new keyword meta tag in terms of being completely worthless as a quality signal (in their eyes). And they have been moving away from it at a rapid pace since Caffeine was rolled out.
John William - http://bit.ly/wqPaMU
I think this move will make it more different from Bing. Let’s see who will be the winner, Bing or Google. 🙂
Julian - http://juliangrainger.co.uk
Hi David
One option that I’m exploring is how they attribute topic to the link from the URL.
Google signaled last year they are looking at ways of de-powering exact match domains. Part of the power of an exact match domain is the default topic within a link that does not have anchor text.
Personally I don’t think they will remove anchor text descriptors in themselves, but if they remove the weight off the URL, it removes an unfair advantage many affiliates have been using to game the system.
Julian
Carla Lendor - http://www.patantconsult.com
I don’t agree that a site on Logitech keyboards linking to a site on Russian brides should raise a red flag. Keep in mind that word meanings are contextual. What if Russian brides want Logitech keyboards as gifts – does that not give the link relevance?
David Naylor
@carla No it doesn’t, IMO. The only way link relevance really works is if all the Russian brides out there wanted a loving husband, a nice house, a large wedding, 2 kids, a BMW and a Logitech keyboard… My guess is that a Logitech keyboard isn’t top of most Russian brides’ wedding lists.
AfroBritish - http://afrocosmopolitan.com
Google is always changing the way it evaluates its whole algorithm in order to stop people misusing it. However, it just makes it difficult for people to know what they are doing wrong or right, especially when they are not so good with all the SEO hullabaloo.
Anton Koekemoer - http://www.antonkoekemoer.com/
Great Post! I enjoyed reading through all the comments. Made for some great reading material.
Thanks so much.
James - http://quumf.com/
Google seem to be talking specifically about how they judge the topic aspects of the linking relationship:
“We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links”
There seem to be a couple of areas where this could apply. A change in the way anchor text is being used seems legit. The vast majority of links that are editorially given these days come from Twitter and Facebook, which use different types of anchor. Facebook typically takes the full title from the page, while Twitter uses the URL and redirects it through t.co. Exact-match anchor links are increasingly rare as a percentage of all the links on the internet, and as such are less representative as a means of determining relevance.
The other area that can define the topic is the relationship between the content on the linking page and the content of the linked page; however, this is also changing. A link from within the Twitter stream will typically be surrounded by a lot of unrelated content:
* Tweet about a picture of a cat
* Tweet about a celebrity falling over
* Tweet about an SEO blog about how Google evaluates links. heresthelink
* Tweet about how angry I am about the train being late
Only 140 characters out of 3,000 on a page may be relevant to a link, but the content around that link is specific to the subject and adds value even if the wider page isn’t.
It’s conceivable that Google are looking at areas of a page in addition to the whole page in calculating topic relevance. The organic web lacks structure at a page level that you see in an optimised website, however sections of content within a given page may be highly relevant to a particular subject. It would make sense for Google to move to an evaluation model that increases the value at a paragraph or div level within the page, which would reinforce the idea that in-content links are of the highest value.
Fergus Clawson - http://www.blueclaw.co.uk
I think Google is turning off PR and replacing it with Author Rank (AR) – this is Google’s plan with G+. Also, Google will devalue spam-based anchor text, especially non-brand anchors within thin content, so if your link footprint doesn’t have many brand anchors and doesn’t have enough AR, then you could be in for a tumble.
Diane - http://www.voucherfreebies.co.uk/
I think Keep Calm and Carry On is the best advice. You could get all het up and stressed worrying about what Google thinks about links today. I think the only conclusion is that we don’t really know what they think of links.
Whatever they’ve done to their search listings, they should start paying attention to their own info too. They can point out what area and country I’m in, but seemingly fail to provide me country-specific URLs.
I don’t want to buy plants from the USA as I suspect they’ll either not be allowed in, or be dead by the time they arrive.
Chris Rendell - https://twitter.com/ChrisRendell
I’m not sure how it could be anchor text. Although there are several different factors that determine what a web page is about, surely anchor text is the most important? There would need to be a replacement of some sort.
I hope it’s PageRank – it would cull a lot of useless SEOs who still bang on about PR.
Suneeta Abraham - http://www.seotrafficsearch.com
Your post covers the core concepts that the Google algorithm uses. For all the good, there are also malpractices involved in increasing the page ranking of a website. This is an eye-opener for most SEOs. The way you have defined every aspect involved in page ranking is great. Overall a great post to go through.
Kevin Blumer - http://www.kbos2.co.uk/
And then we had the next update a few weeks past, and I hear a lot of people moaning to me that Google has changed the way it looks at links again. I know a few people who were on double-figure incomes and it has gone. I might be wrong here, but I think if you have too many high-authority links you will get penalized. I didn’t, so I am OK – I went up the search results a little. The one thing I have never done is put a keyword in an anchor, and that has probably helped me now.