Thursday 27 June 2013

Infographic: 2013 SEO Ranking Factors, From SearchMetrics

The folks at SearchMetrics have released their SEO ranking factors for 2013. This is the ranking correlation study they worked on for the past three months, starting right after the second-generation Penguin update was pushed out by Google. Want to understand everything that was in it? They’ve got an infographic for that (click to enlarge):
[Infographic: US ranking factors 2013]
The key takeaways from their study included:
  • Keyword domains and keyword links have lost relevance
  • Brands are the exception to many rules
  • Social signals continue to correlate very well with better rankings
  • Good content is always important: it comes down to quality
  • The number of backlinks remains immensely important
  • On-page technology remains one of the basics
There has been some criticism of this study on social networks, but as with any ranking correlation study, there are always critiques, and there should be. You can download the study over here.
Also be sure to see our Periodic Table Of SEO Success Factors, which was freshly updated earlier this month, for our own take on ranking factors to consider.

Saturday 22 June 2013

Should SEOs Redirect or Park For Geolocation?

Geolocation is the process of associating a geographic area (usually a country) with a website or web page. This is done to help search engines provide results that are relevant not only to a searcher's keywords, but also to their location.
For example, certain professions, like law, are restricted to geographic locations - a lawyer from Poland probably will not be helpful to a searcher in South Africa, regardless of how well the Polish lawyer's website is optimized. Likewise, many people prefer to buy from vendors based in their own country - not only is shipping less of a concern, but local consumer protection laws are more likely to help if there is a problem.

Definitions
ccTLD

This stands for Country Code Top Level Domain. Generic TLDs such as .com, .net, and .org refer to the type of website the domain belongs to (commercial, network, or organization), whereas a ccTLD refers to where the domain belongs - i.e., a country. Common examples include Canada (.ca), the UK (.co.uk), France (.fr), and so forth. These codes are assigned by the Internet Assigned Numbers Authority (IANA).
Geolocation

The act of assigning a geographic location to a website or webpage.
ccLinks

Incoming backlinks (IBLs) that point from another website to yours, where that other website has a specific country associated with it. For example, a link from a German website would be a ccLink, whereas a link from a normal .com with no country geolocated to it would be considered a normal link. If a site gets enough ccLinks related to a specific country pointing to it, some search engines will often assume it is relevant to that country.
IP Geolocation
In late 2003, I wrote a fairly well-received article called "Only In Canada, Eh?" which outlined several issues and possible solutions regarding geolocation. However, it's now late 2005 and the article is starting to show its age. The search engines have not stood still in the interim, and there are more rules to work with. A fairly complete discussion of the changes can be found at the High Rankings Forum in a thread called: "mcanerin - Only in Canada, Eh?". This article hopes to summarize that thread and to clarify some parts of it, especially related to redirects and parks.

The main conclusion of the original article was that it would be best to host your website on an IP address in your country. This allows Google to associate your website with a specific country easily, regardless of what its domain extension is. This is still true today, but there are other considerations.

Other Search Engines Don't Use IP Geolocation
Google is currently the only major search engine that uses IP address for geolocation. The reason Yahoo won't is quite simple - they know very well that many people host websites outside of their own country. As a matter of fact, Yahoo hosts websites from all over the world, and since its IPs are in the US, following Google's IP-focused method would not accurately reflect the reality that Yahoo knows is going on. Likewise, the folks at Ask are well aware that the preferred hosting location for large Chinese websites is Japan. Google's IP fixation apparently owes more to a lack of experience outside the US than to an accurate reflection of the realities of web hosting in the rest of the world.
The non-Google search engines use two methods of discovering what area a website is relevant to: 1) the ccTLD (Country Code Top Level Domain) and 2) link analysis - they assume that if a very large number of German websites link to you, you must be relevant to Germans, regardless of your hosting location.

Geographic Metatags and GeoTags

A lot of people haven't heard of Gigablast, which is too bad, as it's a nice search engine. Gigablast is different from most search engines in that it actually indexes metatags, and lets you search for results using those metatags. It's the only major search engine that actually uses Geolocation Metatags, which look like this:
<meta name="zipcode"        content="87112,87113,87114">
<meta name="city"           content="albuquerque, abq, rio rancho">
<meta name="state"          content="new mexico">
<meta name="country"        content="usa, united states of america">

If you want to experiment with geographic metatags, you can use my free Metatag Generator to do so. There is another option called GeoTags, which actually incorporates latitude and longitude information, as well. Unfortunately, it's not well supported at all.

For now, geographic metatags don't work with the big 4 search engines, and I don't really expect that to change in the near future, so they are interesting, but not practical at this point.

ccLinks

OK, I just made that term up. For the purposes of this discussion, ccLinks (Country Context Links) are incoming backlinks (IBLs) that the search engine in question considers relevant to a particular country. So, for example, if Yahoo considers my webpage to be a "Canadian" page and I link to you, then Yahoo gets a hint that you are of interest to Canadians.

If Yahoo gets enough of these from other Canadian sites, they will eventually come to the conclusion that your site is somehow relevant to Canadians. Notice that it's not necessarily assuming that it's a Canadian site, but that Canadians would consider it relevant, which may be a better metric anyway.

So it's possible to get your .com, US-hosted site considered relevant to the UK if you have enough UK-relevant sites (ccLinks) pointing to it. This won't work on Google at this time, but does work on Yahoo, MSN and Ask. This can, of course, cause issues if you somehow buy or trade a whole bunch of links to your site from Russian link farms.

Further, I believe from testing that each page can only have one country associated with it. In theory, you could have 20 pages on your site, each considered to belong to a different country. In theory. In practice, it's something to be cautious about while link building - you want to focus on ccLinks from your target country as much as possible, which should be no surprise, since presumably that's part of relevant link building.

ccTLD - The Most Accurate Method

Currently, the best way to make sure that a website is geolocated for a specific area is to make sure it has a ccTLD (Country Code Top Level Domain) for that location. For example, if you wanted to make sure that your website was considered relevant to Canadians, you would make sure that it had a .ca domain extension.

This works very well in all the search engines today, including Google. It's simple and clear.
But there are problems. First, many people already have a well-branded .com site. Second, some countries make it very difficult to get a domain appropriate to that country unless you are incorporated or a citizen there, which can be an issue if you are a multinational corporation, or a travel company that is based in the destination location but is marketing to the user's location.

So What's This Have To Do With Redirects?
Good question. The answer is that one of the most common uses for domain redirects is geolocation using ccTLDs. If you have a .com and want to show up as a country-specific site, your best and easiest method is to get a ccTLD and associate it with the .com website. You can do this by using redirects. I will use .com as the example from now on, but the information below also applies to .net, .org, and other non-country-specific domains.

Geolocation is one of the few instances where I recommend a 302 or park instead of a 301.
Scenario 1
I have an existing .com site and I just bought a ccTLD. I want to have the .com website associated with that country.

Answer 1: Use a 302 redirect from the ccTLD to the .com. Create a second account and point the ccTLD to it. Then redirect using a 302 from that account to the .com account.
This tells the search engine that your ccTLD is the "real" domain and that it's being temporarily redirected to the .com. The search engine will index the .com's content, but keep the ccTLD as the "original" domain. In short, the .com won't be the domain that counts. If you pointed the .com at the ccTLD with a 302 instead, the .com would be kept and you would lose the benefit.

The problem with using a 302 redirect is that it applies on a page-by-page basis, which is good if you want certain pages of your site to be associated with different countries, but harder when you want the whole site to be associated with one country. In this case, you would need to create a sitemap with the ccTLD coded as an absolute (not relative) link to each page of your site you wanted indexed as belonging to the country.
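
If you want to sanity-check which status code your setup actually returns, a quick script can fetch a URL without following redirects and show you the code and target. This is a minimal sketch, assuming the third-party requests library; the domains are placeholders for your own ccTLD and .com:

# Minimal check: what status code and Location header does the ccTLD return?
# Assumes the "requests" library is installed; domains are placeholders.
import requests

for url in ("http://www.example.ca/", "http://www.example.ca/tools.htm"):
    resp = requests.get(url, allow_redirects=False)
    print(url, "->", resp.status_code, resp.headers.get("Location"))
# For the setup in Answer 1 you would expect 302 plus the matching .com URL.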

Answer 2: Park the ccTLD directly on the .com. This associates BOTH domains with the site. Since one is a ccTLD, the site will be considered geolocated. Remember that if you do it this way, it may take a while for the search engine to figure out that this is really the same site, so for a while you will split your link popularity between the two domains, until they are merged. This also creates a potential duplication issue during this period.

The good news is that if you use relative links, then usually both domains will eventually be indexed for each page, and your whole site will be considered associated with the ccTLD. If you don't want this, you will have to control it using absolute links or 302 redirects.
Scenario 2
I have a ccTLD website, but I also want to add my new .com onto it without messing anything up.
Answer: In this case, you would do a 301 redirect (not a 302) from the .com domain to the ccTLD. This tells the search engine to pass on all link popularity to the ccTLD, and to not consider the .com as the "proper" website.
Important Step for All Geolocation Redirect Scenarios

In all of the cases above, you need to do link building to the ccTLD. If you only do link building to the .com, there will be no opportunity for a search engine to index or know about the ccTLD. In view of the ccLinks approach that Yahoo, MSN and Ask use, I strongly recommend that you do as much ccLinking to the ccTLD as possible.
Geolocation Spidering Issues

Remember when I said that every page potentially has its own country linked to it? Each page can only have ONE country linked to it.
How would that be accomplished? Well, the simple method would be to consider all the pages under the ccTLD domain to belong to that ccTLD. Makes sense, right?

Well, it's a little more complicated than that. You see, most sites use relative links internally. Some use absolute links. Whether you use relative or absolute links can make a difference in whether your website's pages are considered geolocated.

Imagine that I point mcanerin.ca at mcanerin.com using a 302. In reality, what I'm doing is allowing the .ca domain to resolve any webpage on the site. But I'm not necessarily actually resolving all those pages.
Let's say that a spider follows a link to mcanerin.ca and lands on the default page of my website. Now, what country is going to be associated with that page? Why, Canada, of course. Now, let's say that there is a link on that page that points to mcanerin.com/tools.htm. What country will that page be associated with? Nothing, really - because the link to it uses the .com extension, not the .ca one!

That's one of the dangers of using absolute links within your website when it comes to geolocation. The absolute link takes precedence over the previous redirect. Now, if the link had just been to /tools.htm, the spider would have followed it relative to the .ca domain it was currently in, and the page would have been associated with Canada in that case.

One use for absolute links within geolocation is a geolocation sitemap. This is an ordinary sitemap for your site, but with two exceptions: 1) it's pointed to with at least one (preferably many) absolute links using the ccTLD, and 2) it contains a list of all the pages in the site that you wish associated with the ccTLD, listed using absolute ccTLD links.
In short, it points to every page using the ccTLD in the link, thus associating that ccTLD to each of those pages.
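
If the site has more than a handful of pages, building that sitemap page by hand gets tedious. Here is a rough sketch of the idea in Python; the domain and paths are placeholders for your own URL list:

# Rough sketch: generate an ordinary HTML sitemap page whose links all use
# absolute ccTLD URLs, so a spider associates each page with the ccTLD.
# The domain and paths below are placeholders.
CCTLD_ROOT = "http://www.example.ca"
paths = ["/", "/tools.htm", "/articles/geolocation.htm"]

items = "\n".join(
    '  <li><a href="{0}{1}">{0}{1}</a></li>'.format(CCTLD_ROOT, p) for p in paths
)
with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write("<html><body>\n<ul>\n" + items + "\n</ul>\n</body></html>\n")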

If you don't do this, or alternatively do a lot of link building with the ccTLD and then use relative links throughout the site, you risk having only your home page, and maybe one or two others, being associated with the country, rather than your whole site, unless you use parking.

Conclusion
Geolocation can be a tricky area. I specialize in it and sometimes find it confusing. The key is to remember that each page is treated separately, and to trace the geolocation through page by page. Also, remember that a 301 throws away the old and keeps the new, a 302 keeps the old and ignores the new, and a park keeps both. If you keep these things clear in your head while you plan it out, you'll get through it fairly easily.


Wednesday 12 June 2013

Video: Google Glass Search


Googlers Mike LeBeau, who is in charge of search within Glass, and Amanda Rosenberg, the product manager of Glass, produced a video showing you what it is like to search for things with Glass.

It is pretty impressive, and you can get a really good idea of how it works in this three-minute video.
Mike said on Google+:

One thing that makes Glass awesome is that you can just long press on the touchpad any time and speak to ask Google a question. It’s a super fast way to search which, it turns out, is pretty magical.
+Amanda Rosenberg and I decided to test out a bunch of cool searches and thought you guys would like 'em too - check it out.

Here is the video:




Google Launches A Google+ Dashboard For Managing Business Pages


Google announced yesterday on Google+ a new Google+ Dashboard for businesses that manage their local pages on Google+.

Jade Wang also posted about it in the Google Business Help forum explaining:

It will give business owners using Google+ to manage local pages one place to easily manage the pages, share, and access other tools like AdWords Express and Offers. Local Google+ page managers will also have access to improved page insights.

Google+ users, find the new dashboard simply by logging into plus.google.com and switching to the page you’re managing.

Does this work for everyone? Not really. Those who have not migrated should stay away from it for now. Jade explained:

Please note -- if you are using Google Places for Business, please continue to do so. New users who are interested in appearing on Google Maps should also start at places.google.com. This announcement only affects business owners who are already using Google+ to manage pages.
Here are some screen shots:

[Screenshots: the new Google+ Dashboard - click for full size]



New features include:

1) The ability to update your info (like website URLs, store hours and phone numbers) across Maps, Search and Google+ - all from the Overview tab.

2) One place to monitor your Google+ notifications, assign page managers, share photos and videos - even start a Hangout with followers.

3) At-a-glance access to your AdWords Express and Offers campaigns.

4) Insights that include top searches for your business, top locations requesting driving directions, and performance data for your Google+ posts.

Forum discussion at Google+ & Google Business Help.

Google's Disavow Link Tool: Their Best Spam Reporting Tool Yet



It is finally official: as promised, Google launched a disavow link tool yesterday afternoon. It was officially launched during the lunch with Matt Cutts at PubCon Vegas.

Yes, Bing launched one months ago so let's get that out of the way now.

Google's Best Spam Reporting Tool
My big issue, as I said before, is that this is not a win/win - it is the best spam reporting tool Google has launched to date. Suffering webmasters will point fingers at their competitors and friends, blaming them for their poor rankings, and Google can use that data.

Matt Cutts said repeatedly at PubCon, in the video (see below) and in the blog post, that you should try not to use it - really, don't use it unless you have to. Why? One example he gave: do not disavow internal links - it can hurt. Right now Google is just using this as a "hint" or "signal," like it did with rel=canonical when that launched, but this will be a powerful signal within six months - so be careful if you have to use it.

Will all SEOs use it when they need it? I suspect so. Will some stand up like they did with the nofollow attribute and say - no, we won't use it because it is a form of outing? I suspect so. But 99% will use it in a second if they feel they need it.

How Does The Google Disavow Link Tool Work?

Okay, now that you will likely use it, how does it work? Go to this page (currently not linked within webmaster tools) to see the sites you can disavow links for.

Now pick a site. You will then see a warning screen, and then it will ask you to upload a disavow.txt file.
Here is an example of what a file might look like:
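(A representative sample, with made-up domains: lines starting with "#" are comments, "domain:" disavows every link from a domain, and the remaining lines disavow individual URLs.)

# Contacted the owner of spamdomain1.example on 7/1/2012
# to request link removal, but got no response
domain:spamdomain1.example
# Owner of spamdomain2.example removed most links, but missed these
http://www.spamdomain2.example/contentA.html
http://www.spamdomain2.example/contentB.html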



Now the rest of the screens ask you to confirm. Once you do, you can always delete the file - but it can take a long long time for Google to process those requests.

Here is a video from Matt explaining it in 10 minutes:

Here is the basic disavow help page in Google's help content. Here are some Q&As from the Google blog:

Q: Will most sites need to use this tool?
A: No. The vast, vast majority of sites do not need to use this tool in any way. If you're not sure what the tool does or whether you need to use it, you probably shouldn't use it.

Q: If I disavow links, what exactly does that do? Does Google definitely ignore them?
A: This tool allows you to indicate to Google which links you would like to disavow, and Google will typically ignore those links. Much like with rel="canonical", this is a strong suggestion rather than a directive - Google reserves the right to trust our own judgment for corner cases, for example - but we will typically use that indication from you when we assess links.

Q: How soon after I upload a file will the links be ignored?
A: We need to recrawl and reindex the URLs you disavowed before your disavowals go into effect, which can take multiple weeks.

Q: Can this tool be used if I'm worried about "negative SEO"?
A: The primary purpose of this tool is to help clean up if you've hired a bad SEO or made mistakes in your own link-building. If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You're also helping to protect your site's image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you're unable to get a few backlinks taken down, that's a good time to use the Disavow Links tool.
In general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you're worried that some backlinks might be affecting your site's reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don't need to worry about negative SEO at all.

Q: I didn't create many of the links I'm seeing. Do I still have to do the work to clean up these links?
A: Typically not. Google normally gives links appropriate weight, and under normal circumstances you don't need to give Google any additional information about your links. A typical use case for this tool is if you've done link building that violates our quality guidelines, Google has sent you a warning about unnatural links, and despite your best efforts there are some links that you still can't get taken down.

Q: I uploaded some good links. How can I undo uploading links by mistake?
A: To modify which links you would like to ignore, download the current file of disavowed links, change it to include only links you would like to ignore, and then re-upload the file. Please allow time for the new file to propagate through our crawling/indexing system, which can take several weeks.

Q: Should I create a links file as a preventative measure even if I haven't gotten a notification about unnatural links to my site?
A: If your site was affected by the Penguin algorithm update and you believe it might be because you built spammy or low-quality links to your site, you may want to look at your site's backlinks and disavow links that are the result of link schemes that violate Google's guidelines.

Q: If I upload a file, do I still need to file a reconsideration request?
A: Yes, if you've received notice that you have a manual action on your site. The purpose of the Disavow Links tool is to tell Google which links you would like ignored.
If you've received a message about a manual action on your site, you should clean things up as much as you can (which includes taking down any spammy links you have built on the web). Once you've gotten as many spammy links taken down from the web as possible, you can use the Disavow Links tool to indicate to Google which leftover links you weren't able to take down. Wait for some time to let the disavowed links make their way into our system. Finally, submit a reconsideration request so the manual webspam team can check whether your site is now within Google's quality guidelines, and if so, remove any manual actions from your site.

Q: Do I need to disavow links from example.com and example.co.uk if they're the same company?
A: Yes. If you want to disavow links from multiple domains, you'll need to add an entry for each domain.

Q: What about www.example.com vs. example.com (without the "www")?
A: Technically these are different URLs. The disavow links feature tries to be granular. If content that you want to disavow occurs on multiple URLs on a site, you should disavow each URL that has the link that you want to disavow. You can always disavow an entire domain, of course.

Q: Can I disavow something.example.com to ignore only links from that subdomain?
A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing "domain:something.example.com" will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.

Forum discussion at Google Webmaster Help, WebmasterWorld, Google+, Cre8asite Forums, DigitalPoint Forums & Search Engine Watch Forums.




Google's Matt Cutts On Disavow Tool Mistakes



Google's Matt Cutts posted a video the other day explaining the top six or so mistakes that SEOs and webmasters make when using the disavow tool.

By far, the most common mistake is uploading anything but plain text (TXT) files. Many people upload Word documents or Excel files; they should not - you should only upload TXT files.

Here is the video followed by the six most common disavow tool mistakes:




(1) The file you upload should be a regular text file only - no special formatting. People often upload Word docs, Excel spreadsheets, etc. Just upload a plain text file.
(2) Users' first attempt is typically to be very specific and fine-tuned with individual URLs. Instead, use the domain: command and disavow the whole site - that is often better. See our machete story.
(3) Wrong syntax is another common issue; use the right syntax.
(4) Do not write the story of why you are disavowing in the disavow text file. Do that in the reconsideration request instead.
(5) Related to that, when people do add notes, they use comment tags. Don't add lots of comments (or any, if you can help it); they increase the chance of errors in Google's parser.
(6) The disavow tool is not the be-all and end-all. It will not cure all your link problems. Clean up your links outside of the disavow tool as well; don't just go this route. (A quick pre-upload check along these lines is sketched below.)
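
The sketch below is a hypothetical helper, not an official Google tool; it only checks the basics from the list above (plain .txt, recognizable syntax, not too many comments, whole-domain entries where possible).

# Hypothetical pre-upload check for a disavow file, based on the common
# mistakes listed above. Not an official Google tool.
def check_disavow_file(path):
    problems = []
    if not path.lower().endswith(".txt"):
        problems.append("Use a plain .txt file, not a Word or Excel document.")
    with open(path, encoding="utf-8") as f:
        lines = [line.strip() for line in f if line.strip()]
    comments = [l for l in lines if l.startswith("#")]
    entries = [l for l in lines if not l.startswith("#")]
    for entry in entries:
        if not entry.startswith(("domain:", "http://", "https://")):
            problems.append("Possible syntax problem: " + entry)
    if len(comments) > len(entries):
        problems.append("Lots of comments - save the explanation for the "
                        "reconsideration request instead.")
    if entries and not any(e.startswith("domain:") for e in entries):
        problems.append("No domain: entries - consider disavowing whole "
                        "domains rather than individual URLs.")
    return problems

for issue in check_disavow_file("disavow.txt"):
    print(issue)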

Saturday 1 June 2013

Backlink Monitoring: Keeping Track of Your Existing Links

When you've put a lot of effort into acquiring links to your website, you want to make sure these links aren't lost at any point.
When multiple links are removed or changed within a short timeframe, this is a strong signal to search engines that these links could have been acquired unnaturally.
Since Penguin, Google has gotten a lot better at finding all similar links. You might lose the value for all of them when only a couple are tagged as obviously unnatural.
[Chart: new links vs. lost links]
Monitoring which links are removed and which links are altered over time allows you to take action before Google does. So how do you effectively keep track of your existing links?

Why Monitor Existing Links?

Sometimes you trade more than just the good content on your website with a link partner. Although natural links should be seen as votes of confidence, sometimes a link is an agreed obligation for business partners, discounts, or other deals.
To see if they keep their end of the bargain, it isn't enough to check on them once. Far too often, links are removed after a couple of months.
Google's spam detection is all about patterns in your link profile. Groups of similar links are judged by how natural their behavior looks and valued accordingly. When multiple links in a group start to behave less naturally, the entire group will be affected negatively.

Automated Link Alerts

Checking 50 links once a month is the most you should be willing to do manually. When you need to monitor over 50 links, you're dependent on automated tools that alert you when a link has been changed.
You can choose between various solutions that run as a desktop application or as a web-based service. When selecting the right service for you, make sure that it has various options to send you alerts, and keep in mind that the more frequently it re-checks links, the better.

What Changes are Important?

The most important change to monitor is losing a link. You need to know if just the link has been removed or the entire page it was on. The latter often happens by accident, but the former requires conscious action from your partner.
It's also important to find out when the specifics of a link are altered. Your link partner might add a nofollow or add affiliate tracking to the link. All changes to link specifics, including anchor text and landing page, should be monitored.
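
If you want to roll your own alerts alongside the tools below, the core check is simple: fetch each linking page and confirm your link is still there, still followed, and still using the anchor text you expect. A minimal sketch, assuming the requests and BeautifulSoup libraries; the URLs are placeholders:

# Minimal backlink check: is the link still on the page, is it nofollowed,
# and what anchor text does it use? Assumes "requests" and "beautifulsoup4"
# are installed; URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

def check_backlink(source_url, target_domain):
    resp = requests.get(source_url, timeout=10)
    if resp.status_code != 200:
        return {"status": "page gone ({})".format(resp.status_code)}
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        if target_domain in a["href"]:
            rel = a.get("rel") or []
            return {
                "status": "link present",
                "nofollow": "nofollow" in rel,
                "anchor": a.get_text(strip=True),
                "href": a["href"],
            }
    return {"status": "link removed"}

print(check_backlink("http://partner.example.com/resources.html", "example.com"))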

Various Tools

If you aren't using expensive tools like Majestic SEO, which has “Lost Links” as just one of their options, there are various alternatives that offer just backlink monitoring. Here are just a couple of solutions:
Majestic SEO
  • Pro: Allows you to dig deep within all your links and you don't have to pre-select which links to monitor.
  • Con: Doesn't have automated alerts and doesn't report on changes, just on lost links within the last six months.
LinkAssistant
  • Pro: Has a lot of additional features to keep track of all link deals.
  • Con: Runs from your local machine and is based on the unwanted practice of link trades.

Linkody
  • Pro: Reports on every change to your link in detail. Linkody re-checks daily and sends automated alerts. It combines various methods of link discovery and can even report on initial placement.
  • Con: Can only be used for backlink monitoring. Comparable functionality is available in complete services like Jetrank and Raven.

Acting on Changed/Lost Links

Once you receive an alert, try to find out what the original deal with that partner was. Contact them as soon as possible to resurrect the link in time before Google flags it.
Although I'm a big fan of naturally acquired links, you need to guard those link-gems you accidentally or deliberately acquired. Backlink monitoring is one of those things too few of us do.

7 Achievable Steps For Great SEO After The Penguin Update

The Penguin update sent a strong message that not knowing SEO basics is going to be dangerous in the future. You have to have the basics down or you could be at risk. Penguin is a signal from Google that these updates are going to continue at a rapid pace, and they don't care what color your hat is - it's all about relevance. You need to look at every seemingly viable "SEO strategy" through this lens. What you don't know can hurt you. It's not that what you are doing is wrong or bad; the reality is that the march towards relevance is coming faster than ever before. Google doesn't care what used to work; they are determined to provide relevance, and that means big changes are the new normal.
[Chart: eHow / Demand Media after the Panda update]
All that said, doing great SEO is an achievable goal; make sure you are taking these steps.

1. Understand your link profile

This is essential knowledge post-Penguin. The biggest risk factor is a combination of lots of low-quality links with targeted anchor text. There seems to be some evidence of a new 60% threshold for matching anchor text, but don't forget about the future; I recommend at most two ranking-focused anchor texts out of every ten. The key metrics I look at for this are:
  • Anchor text distribution
  • The link type distribution (for example, article, comment, directory, etc.)
  • Domain Authority and Page Authority distributions
The goal here is to find out what is currently going on and where you should be going. Compare your site with the examples below.
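
If you export your backlinks to a CSV (most of the tools below can do that), a few lines of scripting will give you the anchor text distribution. This is a rough sketch; the file name and the "Anchor Text" column are assumptions about your particular export format:

# Rough sketch: anchor text distribution from a backlink export.
# "links.csv" and the "Anchor Text" column name depend on your export.
import csv
from collections import Counter

with open("links.csv", newline="", encoding="utf-8") as f:
    anchors = [row["Anchor Text"].strip().lower() for row in csv.DictReader(f)]

counts = Counter(anchors)
total = sum(counts.values())
for anchor, n in counts.most_common(10):
    print("{:5.1f}%  {}".format(100.0 * n / total, anchor or "(empty)"))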

Tools for this:

For anchor text, Open Site Explorer gives you an immediate snapshot of what's going on, while MajesticSEO and Excel can be better at digging into some of the really spammy links.
[Chart: Distilled anchor text - a natural anchor text profile]
Great Excel templates for DA/PA analysis
[Chart: Balsamiq link profile - a natural Domain Authority profile]
For link type analysis I use Link Detective but it seems to be down at the moment (please come back!).

[Chart: Link Detective - an UNNATURAL link type profile]

2. Learn what makes a good link

Great links:
  • Come from respected brands, sites, people and organizations
  • Exist on pages that lots of other sites link to
  • Provide value to the user
  • Are within the content of the page
  • Aren't replicated many times over on the linking site
Those are lofty requirements, but there is a lot of evidence that these high-value links are really the main drivers of a domain's link authority. At the 1:00 mark, Matt Cutts talks about how many links are actually ignored by Google:
That's not to say there isn't wiggle room, but the direction of the future is quite clear: you have no control over how Google or Bing values your links, and there's plenty of evidence that sometimes they get it wrong. The beauty of getting great links is that they aren't just helping you rank; they are VALUABLE assets for your business, SEO value aside. At Distilled this was one of the primary ways we built our business - it's powerful stuff.

3. Map out your crawl path

This is a simple goal, but it can be very difficult for larger sites. If your crawl path is really complex and hard to figure out, then it's going to be hard for Google to crawl. There are few bigger wins in SEO than getting content that wasn't previously being indexed out there working for you.
[Diagram: crawl path]

Sitemaps unfortunately can only help you so much in terms of getting things indexed. Furthermore, putting the pages that are the most important higher up in the crawl path lets you prioritize which pages get passed the most link authority.

4. Know about every page type and noindex the low value ones

I have never consulted on a website that didn't have duplicate or thin content somewhere. The real issue is not that duplicate content always causes problems or a penalty, but rather that if you don't understand the structure of your website, you don't know what *could* be wrong. Certainty is a powerful thing; knowing that you can confidently invest in your website is very important.

So how do you do it?

A great place to start is to use Google to break apart the different sections of your site:
  1. Start with a site: search in Google (for example, site:example.com)
  2. Now add to the search, removing one folder or subdomain at a time (for example, site:example.com -inurl:blog or -site:blog.example.com)
  3. Compare the number you get to the number of pages you expect in that section, and dig deeper if the number seems high
Note: The number of indexed pages that Google features here can be extremely inaccurate; the core idea is to reveal areas for further investigation. As you go through these searches, go deeper into the results with inflated numbers. Duplicate and thin content will often show up after the first 100 results.
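
For larger sites it helps to write the numbers down and flag the sections worth digging into. A trivial bookkeeping sketch (the sections and counts below are made up):

# Trivial bookkeeping sketch: flag site sections where the indexed count
# from a site: search is far above what you expect. Numbers are made up.
sections = {
    "/blog/":     {"expected": 400,  "indexed": 430},
    "/products/": {"expected": 1200, "indexed": 5800},   # worth digging into
    "/tags/":     {"expected": 0,    "indexed": 2100},   # thin/duplicate pages?
}

for path, counts in sections.items():
    expected, indexed = counts["expected"], counts["indexed"]
    if indexed > max(expected * 1.5, expected + 50):
        print("Investigate {}: ~{} indexed vs ~{} expected".format(path, indexed, expected))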

5. Almost never change your URLs

It's extremely common to change URLs for reasons like a new design, a new content management system, new software, new apps... But this does serious damage: even if you manage it perfectly, the 301 redirects cut a small portion of the value of EVERY single link to the page. And no one handles it perfectly. One of my favorite pieces of software, Balsamiq, has several thousand links and 500+ linking root domains pointed at 404s and blank pages. Balsamiq is so awesome they rank for their head terms anyway, but until you are Balsamiq-cool you might need those links.
[Chart: Balsamiq links]
If you are worried that you have really bad URLs that could be causing problems, Dr. Pete has already done a comprehensive analysis of when you should consider changing them. And then you only do it once.

6. Setup SEO monitoring

This is an often overlooked step in the process. As we talked about before, if your content isn't up and indexed, any SEO work is going to go to waste. Will Critchlow has already done a great job outlining how to monitor your website:
  • Watch for traffic drops with Google Analytics custom alerts
  • Monitor your uptime with services like Pingdom
  • Monitor what pages you noindex with meta tags or robots.txt (you would be shocked how often this happens)
Some more tools to help you keep an eye out for problems:
  • Dave Sottimano's traffic and rankings drop diagnosis tool
  • Google Analytics Debugger
  • The various rank tracking tools
  • SEOmoz's Google Analytics hook, which formats the landing pages sending traffic into an easy graph
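
As a lightweight complement to those tools, a scheduled script can catch an accidental noindex tag or robots.txt block before it does damage. A minimal sketch, assuming the requests library; the URLs are placeholders:

# Minimal monitoring sketch: alert if a key page stops returning 200,
# appears to carry a noindex meta tag, or is blocked by robots.txt.
# Assumes "requests" is installed; URLs are placeholders.
import requests
from urllib import robotparser

PAGES = ["http://www.example.com/", "http://www.example.com/products/"]

rp = robotparser.RobotFileParser("http://www.example.com/robots.txt")
rp.read()

for url in PAGES:
    resp = requests.get(url, timeout=10)
    problems = []
    if resp.status_code != 200:
        problems.append("status {}".format(resp.status_code))
    # Crude string check; a real monitor would parse the meta robots tag properly.
    if 'name="robots"' in resp.text.lower() and "noindex" in resp.text.lower():
        problems.append("possible noindex meta tag")
    if not rp.can_fetch("Googlebot", url):
        problems.append("blocked by robots.txt")
    if problems:
        print("ALERT", url, "-", ", ".join(problems))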

7. Embrace inbound marketing

To me, inbound marketing is just a logical progression from SEO; thinking about your organic traffic in a vacuum really just doesn't make sense. Dedicate yourself to improving your website for your users and they will reward you - Balsamiq, which I mentioned earlier, is a perfect example of this. I guarantee you they have done little to no SEO, and yet they rank first for their most important keywords and have a Domain Authority of 81. How did they do it? Fewer features.
[Image: the Balsamiq process]
So what does that really mean? Balsamiq had a rigorous dedication to what their customers really wanted. That's really good marketing, smart business, and intelligent product design all in one. Remember, the future is all about relevance to your users; if you aren't actively seeking this, you will get left behind. There is no excuse anymore - there are plenty of proven examples of making seemingly boring page types fascinating and engaging.

Want to learn more?

If you need more high-impact changes to your SEO, check out the topic list for SearchLove San Francisco; it's the first time Distilled is going to be doing a conference on the West Coast.