The Penguin update sent a strong message: not knowing SEO basics is going to be dangerous in the future. You have to have the basics down or you could be at risk. Penguin is a signal from Google that these updates are going to continue at a rapid pace, and Google doesn't care what color your hat is; it's all about relevance. You need to look at every seemingly viable "SEO strategy" through this lens. What you don't know can hurt you. It's not that what you are doing is wrong or bad; the reality is that the march toward relevance is coming faster than ever before. Google doesn't care what used to work. They are determined to provide relevance, and that means big changes are the new normal.
So what does that really mean? Take Balsamiq, the wireframing software company that comes up again in step 5: they had a rigorous dedication to what their customers really wanted. That's really good marketing, smart business, and intelligent product design all in one. Remember, the future is all about relevance to your users; if you aren't actively seeking this you will get left behind. There is no excuse anymore; there are plenty of proven examples of making seemingly boring page types fascinating and engaging. Just look at eHow / Demand Media after the Panda update.
All that said, doing great SEO is an achievable goal; make sure you are taking these steps.
1. Understand your link profile
This is essential knowledge post-Penguin. The biggest risk factors are a combination of lots of low quality links with targeted anchor text. There seems to be some evidence that there is a new 60% threshold for matching anchor text, but don't forget about the future; I recommend at most 2 rankings-focused anchor texts out of 10. The key metrics I look at for this are:
- Anchor text distribution
- The link type distribution (for example, article, comment, directory, etc.)
- Domain Authority and Page Authority distributions
Tools for this:
For anchor text, Open Site Explorer gives you an immediate snapshot of what's going on, while MajesticSEO and Excel can be better at digging into some of the really spammy links.
[Image: a natural anchor text profile]
Great Excel templates for DA/PA analysis
[Image: a natural Domain Authority profile]
For link type analysis I use Link Detective, but it seems to be down at the moment (please come back!).
[Image: an UNNATURAL link type profile]
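If you would rather compute these distributions yourself, here is a minimal sketch. It assumes your links are exported to a CSV (from Open Site Explorer, MajesticSEO, or similar) with columns named anchor_text, link_type and domain_authority; those column names are hypothetical, so adjust them to whatever your export actually contains.

```python
import csv
from collections import Counter

LINKS_CSV = "links_export.csv"  # hypothetical export file

anchors = Counter()
link_types = Counter()
da_buckets = Counter()
total = 0

with open(LINKS_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total += 1
        anchors[row["anchor_text"].strip().lower()] += 1
        link_types[row["link_type"]] += 1
        da = int(float(row["domain_authority"]))
        da_buckets[f"DA {da // 10 * 10}-{da // 10 * 10 + 9}"] += 1

print("Top anchor texts (share of all links):")
for anchor, count in anchors.most_common(10):
    share = count / total
    # Rough flag based on the 60% matching-anchor-text threshold mentioned above.
    flag = "  <-- over the rough 60% risk threshold" if share > 0.6 else ""
    print(f"  {anchor!r}: {share:.1%}{flag}")

print("\nLink type distribution:", dict(link_types))
print("Domain Authority distribution:", dict(sorted(da_buckets.items())))
```

Even a quick run like this makes it obvious when one rankings-focused anchor dominates the profile or when the bulk of your links sit in the lowest DA bucket.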
2. Learn what makes a good link
Great links:
- Come from respected brands, sites, people and organizations
- Exist on pages that lots of other sites link to
- Provide value to the user
- Are within the content of the page
- Aren't replicated many times over on the linking site
Those are lofty requirements, but there is a lot of evidence that these high value links are really the main drivers of a domain's link authority. At the 1:00 mark of the embedded video, Matt Cutts talks about how many links are actually ignored by Google:
[Embedded video]
That's not to say there isn't wiggle room, but the direction of the future is quite clear: you have no control over how Google or Bing values your links, and there's plenty of evidence that sometimes they get it wrong. The beauty of getting great links is that they aren't just helping you rank; they are VALUABLE assets for your business, SEO value aside. At Distilled this was one of the primary ways we built our business, and it's powerful stuff.
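One of the criteria above, links that aren't replicated many times over on the linking site, is easy to sanity-check on your own link data. Here is a minimal sketch, assuming the same kind of CSV link export as in step 1 with hypothetical source_url and target_url columns; it counts how many distinct pages on each linking domain point at the same target, which is a rough proxy for sitewide footer or sidebar links.

```python
import csv
from collections import defaultdict
from urllib.parse import urlparse

LINKS_CSV = "links_export.csv"   # hypothetical export; adjust to your tool
SITEWIDE_THRESHOLD = 50          # arbitrary cutoff for "replicated many times over"

# Map (linking domain, target URL) -> set of linking pages on that domain.
pages_per_link = defaultdict(set)

with open(LINKS_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        linking_domain = urlparse(row["source_url"]).netloc
        pages_per_link[(linking_domain, row["target_url"])].add(row["source_url"])

for (domain, target), pages in sorted(
    pages_per_link.items(), key=lambda item: len(item[1]), reverse=True
):
    if len(pages) >= SITEWIDE_THRESHOLD:
        print(f"{domain} links to {target} from {len(pages)} pages -- likely sitewide")
```

A high count here isn't automatically bad, but it tells you which links are unlikely to be the high value, in-content links described above.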
3. Map out your crawl path
This is a simple goal, but it can be very difficult for larger sites. If your crawl path is really complex and hard to figure out, then it's going to be hard for Google to crawl. There are few bigger wins in SEO than getting content that wasn't previously being indexed out there working for you.
Unfortunately, sitemaps can only help you so much in terms of getting things indexed. Furthermore, putting the most important pages higher up in the crawl path lets you prioritize which pages get passed the most link authority.
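If you want to see the crawl path the way a crawler does, here is a minimal do-it-yourself sketch: a breadth-first crawl of your own site that records how many clicks each page is from the homepage. The start URL, the page limit, and the decision to only follow same-host links are all assumptions to adjust for your site.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URL = "https://www.example.com/"  # hypothetical; use your own homepage
MAX_PAGES = 500                         # keep the sketch from crawling forever

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl_depths(start_url, max_pages):
    """Breadth-first crawl recording click depth from the start URL."""
    host = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue  # skip pages that fail to fetch or decode
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            # Stay on the same host; only record the first (shallowest) depth seen.
            if urlparse(absolute).netloc == host and absolute not in depths:
                depths[absolute] = depths[url] + 1
                queue.append(absolute)
    return depths

if __name__ == "__main__":
    for page, depth in sorted(crawl_depths(START_URL, MAX_PAGES).items(),
                              key=lambda item: item[1]):
        print(depth, page)
```

Pages that only show up five or six clicks deep (or never show up at all) are the ones to pull higher in the crawl path or link to from stronger pages.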
4. Know about every page type and noindex the low value ones
I have never consulted on a website that didn't have duplicate or thin content somewhere. The real issue here is not that duplicate content always causes problems or a penalty, but rather that if you don't understand the structure of your website you don't know what *could* be wrong. Certainty is a powerful thing; knowing that you can confidently invest in your website is very important. So how do you do it?
A great place to start is to use Google to break apart the different sections of your site:
- Start with a site: search in Google
- Now add on to the search, removing one folder or subdomain at a time
- Compare the number you get to the number of pages you expect in that section and dig deeper if the number seems high
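Knowing how many pages you expect in each section is half the exercise, so it helps to tally the URLs you think you have. Here is a minimal sketch that reads an XML sitemap and counts pages per top-level folder; the sitemap URL is an assumption, and sites with sitemap index files or multiple subdomains will need a little more work. Comparing these counts against the site: numbers above is a quick way to spot sections Google has either missed or filled with pages you never meant to exist.

```python
from collections import Counter
from urllib.parse import urlparse
from urllib.request import urlopen
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical; use your own
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def pages_per_section(sitemap_url):
    """Count sitemap URLs by their first path segment (top-level folder)."""
    tree = ET.fromstring(urlopen(sitemap_url, timeout=10).read())
    counts = Counter()
    for loc in tree.findall(".//sm:loc", NS):
        path = urlparse(loc.text.strip()).path
        segment = path.strip("/").split("/")[0]
        counts["/" + segment + "/" if segment else "(root)"] += 1
    return counts

if __name__ == "__main__":
    for section, count in pages_per_section(SITEMAP_URL).most_common():
        print(f"{section:30s} {count}")
```

Sections where Google reports thousands of indexed pages you can't account for are the first candidates for a noindex.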
5. Almost never change your URLs
It's extremely common to change URLs for reasons like a new design, a new content management system, new software, new apps... But this does serious damage, and even if you manage it perfectly the 301 redirects cut a small portion of the value of EVERY single link to the page. And no one handles it perfectly. One of my favorite pieces of software, Balsamiq, has several thousand links and 500+ linking root domains pointed at 404s and blank pages. Balsamiq is so awesome they rank for their head terms anyway, but until you are Balsamiq cool you might need those links. If you are worried that you have really bad URLs that could be causing problems, Dr. Pete has already done a comprehensive analysis of when you should consider changing them. And then you only do it once.
6. Setup SEO monitoring
This is an often overlooked step in the process. As we talked about before, if your content isn't up and indexed, any SEO work is going to go to waste. Will Critchlow has already done a great job outlining how to monitor your website:
- Watch for traffic drops with Google Analytics custom alerts
- Monitor your uptime with services like Pingdom
- Monitor what pages you noindex with meta tags or robots.txt (you would be shocked how often this happens)
- Dave Sottimano's traffic and rankings drop diagnosis tool
- Google Analytics Debugger
- The various rank tracking tools
- SEOmoz's Google Analytics hook, which formats the landing pages sending traffic into an easy graph
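The noindex point above is worth automating because it is so easy to ship by accident. Here is a minimal sketch, assuming a hand-maintained list of your most important URLs; it flags pages that fail to load, carry a noindex robots meta tag, or are blocked by robots.txt. Run it on a schedule and alert on any output.

```python
import re
import urllib.robotparser
from urllib.parse import urlparse
from urllib.request import urlopen

# Hypothetical list of the pages you cannot afford to lose from the index.
IMPORTANT_URLS = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/",
]

# Very rough check for <meta name="robots" content="...noindex...">.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def check(url):
    problems = []
    root = "{0.scheme}://{0.netloc}".format(urlparse(url))
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url(root + "/robots.txt")
    try:
        robots.read()
        if not robots.can_fetch("*", url):
            problems.append("blocked by robots.txt")
    except Exception:
        problems.append("could not read robots.txt")
    try:
        response = urlopen(url, timeout=10)
        if response.status != 200:
            problems.append(f"HTTP {response.status}")
        if NOINDEX_RE.search(response.read().decode("utf-8", "ignore")):
            problems.append("noindex meta tag present")
    except Exception as exc:
        problems.append(f"failed to fetch: {exc}")
    return problems

if __name__ == "__main__":
    for url in IMPORTANT_URLS:
        for problem in check(url):
            print(f"{url}: {problem}")
```

This doesn't replace uptime monitoring or Analytics alerts, but it catches the quiet failures (an accidental noindex or a bad robots.txt rule) that those tools only surface after traffic has already dropped.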