Signals of Quality

Every year, certain optimization techniques become generally accepted best practices. Before an idea is accepted, though, it sits in an intermediate state where the evidence of its value in improving rankings is still thin. Many of these in-between items are treated as signals of quality.

So what is a signal of quality? It's some aspect of a website whose existence indicates that the site is less likely to be engaged in search engine spam techniques. Enough signals should add up to provide a boost to a site when it is compared to other sites that don't provide the same signals. These signals aren't proof, of course, just another piece of the puzzle.

Signals of quality exist everywhere, and in many cases we respond to them subconsciously. For instance, if you're looking for a good restaurant and you're particularly nervous about cleanliness, you'll look for signals that the kitchen is clean. There's no way for you to know the true state of the kitchen, but you can infer it by answering questions like: Is the table clean? Are the glasses spotless? Does the waiter's uniform look professional? Restaurant staff who take care of these things are signaling to you that they are also taking care of the things you can't see.

With websites, signals of quality run the gamut. Here's a list that is hardly exhaustive, but will hopefully prove useful:

Domain Age
A really old domain is one that existed before anyone cared about SEO. If it existed before anyone thought about gaming the search engines, then it's less likely to be trying to game them now. Note that domain age is said to reset whenever there is a change of ownership, so a 10-year-old domain that changed hands last month isn't going to provide as strong a signal as it did before the sale.

Shared IP Addresses
If an IP address has multiple websites associated with it, then it can be inferred that the website owner isn't paying much for hosting. Spammers often choose this route to keep their costs low, so a dedicated IP signals that the owner is genuinely interested in a long-term, successful web presence.

Code to Text Ratio
Sites that contain 100 KB of HTML with only 2 KB of actual content may be signaling a lack of sophistication, and perhaps a lack of interest in doing what's right for the user (i.e., creating pages that load quickly and feel responsive). Since search engines want to keep their users coming back, they want to send them to sites that will be well received and therefore make for a good search experience.

Note that Rand Fishkin of SEOMoz, quoting Vanessa Fox of Google, suggests that Google ignores the code entirely, implying that this ratio doesn't play any role at all.
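
Whether or not the ratio matters to Google, it's easy to measure for your own pages. The sketch below is a rough, standard-library-only Python way to compare the size of a page's raw HTML with the size of its visible text; the URL is a placeholder and the output is only a diagnostic, since no search engine publishes how (or whether) it computes anything like this.

    # Rough sketch of a code-to-text ratio check (illustrative only).
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TextExtractor(HTMLParser):
        """Collects visible text, skipping script and style blocks."""

        def __init__(self):
            super().__init__()
            self.parts = []
            self._skip = 0  # depth inside <script>/<style>

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self._skip += 1

        def handle_endtag(self, tag):
            if tag in ("script", "style") and self._skip > 0:
                self._skip -= 1

        def handle_data(self, data):
            if self._skip == 0:
                self.parts.append(data)

    def code_to_text_ratio(url):
        """Return (html_chars, text_chars, text_share) for the page at url."""
        html = urlopen(url).read().decode("utf-8", errors="replace")
        parser = TextExtractor()
        parser.feed(html)
        # Collapse runs of whitespace so markup indentation doesn't count as text.
        text = " ".join("".join(parser.parts).split())
        return len(html), len(text), len(text) / max(len(html), 1)

    if __name__ == "__main__":
        size, text_size, share = code_to_text_ratio("https://example.com/")
        print("HTML: %d chars, visible text: %d chars, text share: %.1f%%"
              % (size, text_size, share * 100))

A page like the 100 KB / 2 KB example above would report a text share of roughly 2%, which is the kind of number that should prompt a cleanup regardless of what the engines do with it.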

CSS vs. Tables
There is a lot of debate about the advantages of CSS when it comes to SEO. For me, there are two signals here. The first is that a redesign from tables to CSS reads as a site-wide investment: a site that is maintained and updated signals that someone cares about it, and is therefore worth a look by the search engines. The second is that CSS can improve the code-to-text ratio (see the previous item).
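
To illustrate the second signal, here is the same simple page header marked up both ways. The markup and names are hypothetical, but the difference in per-page weight is typical, since the CSS version moves presentation into a stylesheet the browser downloads and caches once.

    <!-- Table-based layout: presentational markup repeated on every page -->
    <table width="100%" cellpadding="0" cellspacing="0" border="0">
      <tr>
        <td width="180" valign="top"><img src="logo.gif" alt="Acme"></td>
        <td valign="top"><font face="Arial" size="2">Welcome to Acme...</font></td>
      </tr>
    </table>

    <!-- CSS-based layout: structure only; styling lives in the cached stylesheet -->
    <div id="header">
      <img src="logo.gif" alt="Acme">
      <p>Welcome to Acme...</p>
    </div>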

Valid HTML / XHTML
The W3C makes it easy to validate a web page and ensure that it conforms to standards. Since valid pages almost never happen without a conscious effort to make them error-free, having them signals that someone behind the site is being careful with their work.
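
If you want to check pages in bulk rather than one at a time in the browser, the W3C's Nu HTML Checker also answers over HTTP. The Python sketch below assumes the third-party "requests" package and the checker's "doc" and "out=json" parameters as publicly documented; verify the current interface and its usage policy before pointing it at many URLs.

    # Sketch: ask the W3C Nu HTML Checker to validate a page and count problems.
    import requests

    CHECKER = "https://validator.w3.org/nu/"

    def validation_messages(url):
        """Return the checker's list of messages (errors, warnings) for url."""
        resp = requests.get(
            CHECKER,
            params={"doc": url, "out": "json"},
            headers={"User-Agent": "validity-check-sketch/0.1"},
        )
        resp.raise_for_status()
        return resp.json().get("messages", [])

    if __name__ == "__main__":
        messages = validation_messages("https://example.com/")
        errors = [m for m in messages if m.get("type") == "error"]
        print("%d errors, %d other messages"
              % (len(errors), len(messages) - len(errors)))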

Existence of Robots.txt File
This file, which sits in the root folder of a website, gives search engine crawlers instructions about which parts of the site they should and shouldn't crawl. Without it, search engines are left to assume that all content is fair game. Thus, one could argue that if the file exists and explicitly permits search engines to crawl the site, then, all other things being equal, the site that gave permission should beat out one that didn't.
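
As an illustration, a minimal robots.txt that explicitly welcomes every crawler while keeping one section off limits might look like the following; the /private/ path and the sitemap URL are placeholders, not anything prescribed by the standard.

    # Allow is a widely supported extension; Disallow is the original directive
    User-agent: *
    Allow: /
    Disallow: /private/

    Sitemap: https://www.example.com/sitemap.xml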

Numerous Site Wide and Irrelevant Links
Corporate sites are often the worst when it comes to site-wide links. Politics and “too many cooks in the kitchen” syndrome often result in header and footer links that point to every division and subsidiary, regardless of whether those other sites are related from a content perspective. The existence of these links implies that the site's users aren't important enough to trump corporate egos. Conversely, the absence of such links can signal that the user is king, a philosophy Google encourages.

Got any more signals you think I should list?

1 Comment

  1. CSS for me simply lets me position fields of navigation links where I want them, move content to the top of the page, and make site-wide design changes - some of which are an SEO benefit, some of which aren't.

    I don't really believe that the presence of a robots.txt file influences Google in any way.

    Sitewide links, I agree with. We're actually removing these where possible from some sites, to help theirs and ours.
