Friday 29 March 2013

The Actual Answers To Every Matt Cutts Video





http://www.theshortcutts.com/

Survival Tips For Becoming An In-House SEO


As my friends know, in January, I became an in-house SEO. Now that I have a few weeks under my belt, I thought I’d discuss some of the differences between agency SEO and in-house SEO, the type of traits it takes to succeed in-house, and some of the traps you can land in.

Agency vs. In-house

Let’s start with some of the differences between agency SEO and in-house SEO.
In-house Has No Hamster Wheel
In an agency, it’s inevitable. While waiting for clients to implement your recommendations, you hit roadblocks and dead ends.

Is this the perfect workout for some SEO agency employees? Image Credit: PBoyd04

Except, you have a minimum number of billable hours and a table declaring how many recommendations you must produce. You cannot start a new project without a client meeting. The next meeting is scheduled in two weeks, when you must present your recommendations from today. Frustrated, you look for any suggestion you can give, whether it will actually help or not.
As an in-house SEO, I can investigate and negotiate roadblocks or I can find something else to work on, something that will create value.
No Billable Hours
Saints be praised; no more billable hours. I don’t care what tool or stopwatch you use, tracking billable hours is an impossible farce created by sick, sadistic minds. It always comes down to imperfect memories and estimates.

In-house SEOs do not have to track billable hours
Image credit: Zak Greant

In-House Has More Distractions
At the agency, the CEO and I were the go-to people for all things search and social. And, because he was the CEO, I got most of the questions. That meant I worked on every client whether I was on the team or not.
Imagine my surprise: as an in-house SEO, I get more questions and interruptions from colleagues than ever. It makes sense. At the agency, everyone from the receptionist to the CEO lives and breathes search and social. They know the 80%. They need help with the difficult 20%.
In-house, I have to educate and evangelize from the ground up. People here know organic search and social are important, but they are just learning how to implement them. Unlike PPC landing pages, which are largely isolated, every brochure page, case study, blog post… every indexed page affects SEO.
In-house, You Can Walk Down The Hall & Talk Face-to-Face
When you hit roadblocks with agency clients, you’re at their mercy. Whether it’s a communications slowdown, a technical hurdle, or defensive posturing, you have to wait for the client to implement or give you the go-ahead.
In-house, I can simply leave my desk and visit my co-workers. I don’t always get the answers I want. However, I can learn exactly where each problem lies and work with people who understand and share my goals.

SEO Communications

Whether you work as an SEO at an agency or inside a business with more than a few employees, there are many things you must be able to communicate, including the following:
Communicate Clearly Without Jargon
When you spend your days with co-workers inside agency walls or with colleagues at conferences, it’s easy to communicate using terms the uninitiated will not understand. A wise professor told me the true meaning of knowledge is the ability to teach. Know your plain language definitions and practice using them. Let colleagues know you want them to ask for an explanation when you say something they do not understand.
Be Disciplined & Concise
Similarly, you cannot monopolize everyone’s time by droning on and on. If you cannot be clear, persuasive, and brief at the same time, you will never succeed as an in-house SEO.

Discipline
Image Credit: Grotuk

Stay Resolute, Politically Perceptive & Judicious
On day one, my CEO told me our IT director's prime directive: don't let the app crash. Smartsheet's reliability is as important as its features. Because our website and application are on the same domain, certain things require heavy testing or are simply off limits. At the agency, this would have been the end of the discussion. In-house, I can respect these challenges and still work with the IT director and application developers to overcome them.
Select High Value Activities, Then Stay Organized & On Track
When I arrived at Smartsheet, I saw many different things I could work on. Throughout my first month, I seemed to explore or pursue a different opportunity every day. During my second month, I began to settle down and isolate the things I can accomplish that will lead to the greatest returns.
Whether your SEO department is just you or a team, you can only accomplish so many projects within so much time. Make the most of what you have.
Knowledge You Can Share
As an in-house SEO, you will enjoy sharing your knowledge. Keep in mind, there are some things worth sharing right away.

The SEO Arms Race

It isn't enough to optimize your content and earn links or authority. SEO is an arms race. You have to catch up with the leading keyword competitors and pass them even as they accelerate their own content and authority building. It's not enough to be as good. You must become better, faster, and stronger.
The search engines are in an arms race of their own, with spammers. While the search engines get better at catching spam, the spammers keep finding new ways to manipulate the search engines. This is an arms race no legitimate business should engage in. Do you want to be the SEO who burns your employer's domain?
Site Optimization Vs. Page Optimization
Make sure your marketing department understands site optimization concepts like internal link architecture, domain authority, and supporting content.
In a competitive space, you're not going to publish one page about red widgets and leap into Google's top ten. It can take several supporting documents, many with their own link- and citation-attracting qualities. If your company is used to paid search, they may not realize this.
Not Every Page Is An SEO Landing Page
Does everyone understand that while you can optimize any page for either short- or long-tail keywords, not every page is going to rank or drive traffic? It is usually better to keyword optimize your top-level and high-value content, then skip the rest and move on to publishing new content that will earn links and authority.
For the sake of clarity, you will still need to clear up any technical SEO errors and optimize your internal linking. Also, just because you're not optimizing a page does not mean you won't edit it (for example, to add an anchor-text link to a real SEO target page).
At some point, after you develop a track record of SEO wins, start revisiting your lower-value pages to see whether you can optimize them for long-tail keywords. This is a great task for interns or new staff members who need to build up their familiarity with your website.
Different Types Of Content Have Different SEO Strength
This is a favorite of mine. Different Web assets or documents serve different purposes and have their own unique SEO strength.
For example, a case study may be ideal to launch a press release campaign around while it is unlikely to go viral on Twitter or earn many links. Live blogging from a popular conference can earn lots of mentions and links, but may not convert many sign-ups or sales. Prepare your company to create many types of content for many different purposes.
This is where I insert the flywheel analogy. The more popular your brand, blog, and social media assets are, the more likely they are to help SEO in different ways. You have to earn that reputation. On a popular blog, a post might go viral, while on a lesser known blog, the exact same post will just sit there. Get your flywheel spinning.
Playing Within Your Competitive Footprint
Be sure your co-workers understand which keywords your company can rank for now and which ones are within reach. There is a reason so few SEO professionals try to rank for the keyword SEO: we know it's a waste of time. Create short-term goals, long-term goals, and big, hairy, audacious goals. Be assertive and aggressive, but do not create unreasonable expectations.
Here is one of my tips. When I do keyword research, I select based upon the phrase match numbers, but I always report the exact match numbers.
Optimization Vs. Over Optimization
Warn your coworkers about over optimization. There are two good reasons for this. First, you don’t want to incur an over-optimization penalty from the search engines.
Another reason is to give yourself some breathing room. Everywhere you go, there will be that person, the one who says things like, "You did this over there, so why haven't you done it over here?" As Bones McCoy might say, "For God's sake, Jim, you're a human being, not a machine."
Even if you keep checklists (you should), you will not optimize every page exactly the same. SEO is a craft, part science and part art. The more you do it, the better you get, and the more intuitive it becomes. Sometimes, it's just nice to be able to say in a calm, reassuring voice, "I didn't want to over-optimize the page, but I can reconsider that."
Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.
Related Topics: In House Search Marketing

About The Author: Tom is the in-house SEO lead at Smartsheet, a leading online project management and collaboration application. As an SEO and social media strategist, Tom has helped some of the Internet's largest brands grow their inbound traffic. You can read more of Tom's musings at http://inboundbound.com.

5 Simple Ways To Debug Your Google Analytics Installation


As you might guess, we QA a lot of Google Analytics installs. It is often a maddening task that makes you want to “gaq.”
However, there are some nice tools that go a long way toward making life easier.
If you are questioning the data you’re getting out of Google Analytics; if your e-commerce reporting doesn’t match your sales; if you really thought there’d be more downloads of your whitepaper on the fonts used in movie credits; then you can use these tools to find out if Google Analytics is broken or if the error lies somewhere else.
This is how we do it.

Things That Go Wrong

Typos

Sometimes we just get things wrong. The Google Analytics developers site references some of the most Common Tracking Code Errors.

Fancy Quotes

We do a lot of our data collection in Microsoft Word. This means the IT departments of our clients are cutting and pasting our code from Word.
If you know what I’m about to say, you’ll have an evil grin on your face.
In its effort to be helpful, Word likes to add fancy quotes to everything you do. Microsoft calls these “Smart Quotes.”
We call them “Fart Quotes” as in “brain fart,” and they turn this:
'product category'
into this:
‘product category’
You can turn them off after a Dante-like descent into the nine circles of Word configuration, as the MalekTips blog will demonstrate.
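To see why this matters, here is a sketch of what that does to a Google Analytics call (the event values are hypothetical):

_gaq.push(['_trackEvent', 'Downloads', 'product category']); // straight quotes: valid JavaScript
_gaq.push([‘_trackEvent’, ‘Downloads’, ‘product category’]); // after Word: curly quotes, a syntax error

The second line kills the script, and the page quietly stops reporting data.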

Wrong Google Analytics Account

We find all kinds of strange configurations when we start optimizing a website. Often, we’ll be granted access to a Google Analytics account, only to find out that a completely different property ID (as defined by the “UA-#######-#”) is being used.

Viewing page source: two accounts on one page.

Other Google Analytics Accounts

If you use a content management system like WordPress, you might find that some of your plugins also use Google Analytics. The most common one we find is the Disqus comments plugin, which you can only detect with some of the tools I introduce below.

View Source

The most common place to start when debugging your Google Analytics tracking code is to simply open a key page and view the page's source.
In almost any browser, you simply right click on the page and select "View page source".

Unfortunately, Chrome’s “Translate to English” option doesn’t help in this scenario.
Once you have the page open, you can search the page just like any webpage (Ctrl+F or F3) for some common Google Analytics strings.
  • Searching for "Google" will find the domain that the tracking code uses to download the JavaScript files. It will also find all of your AdWords-related tags and code.
  • Searching for "gaq" will find a common variable used in Google Analytics implementations.
  • Searching for "UA-" will help you find the property ID used in the tracking code.
You should try this on the following pages (for reference, the standard tracking snippet appears right after this list):
  • Your home page
  • Your PPC landing pages
  • Your “Thank You” or “Receipt” pages
  • Your shopping cart, registration process, or subscription process
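For reference, here is the standard asynchronous tracking snippet of this era, with a placeholder property ID, showing where each of those strings lives:

<script type="text/javascript">
  var _gaq = _gaq || []; // the "gaq" queue variable
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']); // the "UA-" property ID
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; // the "google" domain
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>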

Ghostery

An “easier” way to see if Google Analytics is on a page is to use the plugin Ghostery. There is a version for all of the popular browsers.
Ghostery reveals that Google Analytics is on the page. It also reveals other tools that are installed, making it a great way to spy on your competitors.

In the image above, we can see that Google Analytics is installed on the page, but that doesn’t mean that the tool is installed correctly. We can also see that this site has CrazyEgg and Optimizely installed, two tools of the conversion specialist.
If you find these on a competitor’s site, be very afraid.

Generate Data & Look In Google Analytics

Once you feel that you’ve got Google Analytics installed, you can use the tried and sometimes-true method of simply logging in to Google Analytics and seeing if it is reporting data.
If you’re the only visitor to your site, this just might work. Otherwise, keep reading.

Firefox Debugger

I’ve just discovered this debugger for Firefox by Keith Clark, called GA Debugger.
I like the simplicity of this plugin. It shows which property IDs you come across, which pageviews get generated, Events, Custom Variables and more, even as you navigate across sites. Those new to Google Analytics will like the hierarchy of the listing. It shows you how Events and Custom Variables relate to Pageviews in the system.
It doesn’t let you save a log of the data you collect, however, and this can prevent more detailed analysis.
GA Debugger Screen

Google Analytics Debugger For Chrome

Google’s debugger is only available as a plugin for the Chrome browser, but it provides the most detailed information of any of the tools I’ve found.
Google provides a debugging version of the Google Analytics Javascript code that generates messages for you as it works. This allows you to see exactly what is being written to your Google Analytics database, and what is not.
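If you want the same messages without the extension, Google also publishes a debug build of the tracking script. A quick sketch, assuming the standard asynchronous snippet: point the script source at ga_debug.js instead of ga.js (and remember to switch it back before deploying):

ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/u/ga_debug.js';

Every call will then log its details to the JavaScript console described below.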
Install the plugin and an icon appears in your extensions bar. This extension works in conjunction with a built-in feature of the Chrome browser, the JavaScript console, which you can open by clicking the "Customize and Control Google Chrome" button and selecting the "Tools" menu. See the following figure.
The JavaScript console
The information you collect is substantial. Every call is logged along with every parameter. The data collected by Google Analytics is also logged. You can see the Property ID, the URL of the pageview, the domain and referring URL.
You can also QA campaign information, such as source, medium, content and term.
You will be given information on Events, including Name, Type, Label and Value.
Your Custom Variables will be listed, complete with Label and Scope.
It’ll disclose what you are reporting to GA Ecommerce Tracking.
Google Analytics Debugger messages
Here’s a helpful tip: If you right-click in the console area, you can select the “Preserve log upon navigation” option, which keeps the console area from being cleared with each new page.
The "Preserve log upon navigation" option
Now, you can cut and paste the contents into a text editor and use filtering and regular expressions to zero in on just the information you want.
But, we’ll save that kind of analysis for another column.

Best Free Website Analytics Tools



Clicky

Clicky prides itself on providing real time analytics. The UI is very clean and functional, and there is also a dedicated iPhone version.

Google Analytics

Google Analytics is probably the most popular free analytics tool available. One of my favorite features is its custom reporting.

Reinvigorate

Reinvigorate also provides real-time stat tracking and can alert you when a visitor performs a particular action on your site. It also features heat map technology that lets you see where visitors are clicking.

Piwik

Piwik is open source and is built with PHP and MySQL. To use it, you have to install it on your own server, which is a simple process and only takes about 5 minutes.

Yahoo! Web Analytics

Yahoo! Web Analytics is a free, full-featured enterprise analytics solution with powerful and flexible dashboards, segmentation tools, and campaign management features.

WordPress.com Stats

If your site runs on WordPress and you're not crazy about being overwhelmed with too many features, then WordPress.com Stats might be right for you.

Woopra

Woopra claims to be the world's most comprehensive, information-rich, easy-to-use, real-time Web tracking and analysis application. Judging by the quality of its user interface, they might be right.

FireStats

FireStats doesn’t feature a lot of fancy graphs and charts, but some may find this refreshing.

GoingUp

With an AJAX-rich interface, GoingUp! combines powerful web analytics with top-notch SEO tools.

Mint

Okay, Mint isn't exactly free. However, for a flat fee of $30 and considering all of its features, it might as well be free.

Blog Tracker

Blog Tracker is a lightweight analytics tool for blogs.

What is Geo SEO?


Search Engine Optimization (SEO) is the chief set of processes in Internet Marketing designed to achieve consistently improving organic rankings for targeted keywords. All SEO is focused on Google, Yahoo and Bing, where, combined, over 97% of all organic searches originate.
An important component of SEO is Geo-SEO; as the name indicates, this is the part of SEO focused on geography. Keywords that are targeted on geography are called geocentric keywords. While an Internet Marketing Company wants to be found for "Internet Marketing Services," it may well wish to be found for "Internet Marketing Services New Jersey." Many searchers want to find someone local to them, as a company with a local presence adds a bit of confidence and peace of mind.
There are scads of businesses that are ONLY interested in being found locally; fortunately their potential customers / clientele are equally interested in finding local businesses.  Just consider this relatively short list of businesses primarily concerned with Geo-SEO:
  1. Contractors (Builders, Electricians, Plumbers, HVAC, Roofers, Landscapers, etc).
  2. Professionals (Physicians, Attorneys, CPAs)
  3. Retail Stores
  4. Restaurants
  5. Entertainment
Bottom line is that local businesses want to be found by local searchers and local searchers want to find local businesses.  The searchers are using geocentric keywords because they don’t want non-local results.  This requires local businesses to focus on Geo-SEO so that they will meet their potential customers at the common meeting point – page 1 of a targeted search.
Geo-SEO Considerations
  1. Keyword research with the addition of local addenda i.e. city, town, state, zip code, region, neighborhood.
  2. Onsite Optimization, so that indexed page content fully describes, both generally and specifically, the total targeted geographic area.
  3. Offsite Optimization – generate content on the site targeting geocentric keywords, and syndicate that content to high-value content sites to develop geocentric backlinks.
  4. Google Maps – register your website with Google Maps to be included in the special section that shows a map with location pins for registered listings. This is not technically SEO, in the sense that you don't need a website to register and be found on these maps.
  5. Geocentric Sub-domains – e.g., atlanta.yourdomain.com, to isolate and segregate your Geo-SEO results. This can be used by businesses with multiple locations, all seeking website visitors.
All of the rules and guidelines of SEO still apply to Geo-SEO. Onsite and offsite optimization techniques remain the same. Content remains essential. The only real difference is the addition of geocentricity: from keywords to metadata to titles and descriptions and, of course, your target audience.
Are you looking to target specific geography?  Are there any complicated issues to deal with?  Just contact us for a free initial consultation.

What is Geo SEO?


Introduction

SEO, or Search Engine Optimization, is the name given to techniques which marketing professionals can employ to improve their organic rankings for specific target keywords on search engines such as Google, Yahoo and Bing.
One important aspect of the search engine optimization process is Geo-SEO. As can be inferred from the term, Geo-SEO is that component of SEO which is focused on geography, and it is particularly important for multi-site brands.
Businesses following Geo-SEO protocols encourage the use of 'geocentric' keywords that are specific to a geographical location. As much as an internet marketing company will appreciate being recognised for the search term "Internet Marketing Services", if it is located in New York, being found for the query "Internet Marketing Services New York" will be much more valuable.

Leading Industries Using Geo-SEO

There are countless businesses that prefer being found locally in search engines rather than globally. The fact that their existing clientele and potential customers equally prefer to find local businesses works in their favor. Here is a list of businesses that would prefer Geo-SEO over conventional SEO practices:
•  Retail stores, discount stores, and other multi-site retail brands
•  Restaurants, cafes and fast food outlets
•  Tradesmen such as electricians, plumbers and builders
•  Certified professionals such as doctors, attorneys and accountants
•  Entertainment, movie theaters, theme parks, zoos, recreational centers and arcades
It is crucial for the above-mentioned businesses to make sure that search engines direct queries to the geographical location where their business earns its livelihood. Therefore, they are careful to choose 'geocentric' keywords, as this discourages the search engine from returning non-local results.

Geo-SEO Considerations

Using geocentric keywords allows local businesses to increase the chances of directing their potential customers to the established rendezvous point. Here are some important considerations that local businesses should make when using Geo-SEO:
Keyword research – Careful research should be carried out to find keywords that set the business apart; keywords nearly identical to those already used by competing businesses will not help much. For best results, businesses can add local details such as postcode, region, state, town or city.
Onsite Optimization – This process helps search engines index the content that businesses put on their web pages.
Offsite Optimization – It is equally essential for businesses to optimize offsite content in order to generate indexed content and target geocentric keywords. Offsite optimization also syndicates webpage content to other higher ranked content sites and this helps to establish geocentric backlinks.
Google Maps – Google maps is a useful online resource. It pays to register your business’ online portfolio with it so that it is listed in the premium section. This provides search engine users with a geographical tag on your business’ location.
eHound has developed a tool specifically for this purpose, allowing clients to download a file from their account that has been pre-formatted for uploading to Google Maps.
Geocentric Sub-domains – 'Geocentralizing' your business with sub-domains also works wonders, as it effectively isolates your locations within Geo-SEO search results. You may have come across sites with sub-domains like pittsburgh.holidayinn.com. This is an effective way of segregating the search results and is popular among businesses with branches.

SEO or Geo-SEO

There is not much difference between traditional search engine optimization practices and geocentric search engine optimization guidelines; all procedures and guidelines used in conventional search engine optimization can be applied to Geo-SEO. For instance, onsite and offsite optimization protocols have been borrowed virtually unchanged from standard practice. Content remains as essential in Geo-SEO as it is in SEO. The only disparity between the two, and the one that gives Geo-SEO its name, is the introduction of geocentric keywords, metadata, titles and descriptions. This is where the business's main focus shifts from targeting internet users throughout the globe to a more realistic and achievable target market.


Thursday 28 March 2013

Free Website Security Check Tools – Online


Norton Safe Web, from Symantec – So, how can you find out if a Web site is a safety risk before you visit it? Norton Safe Web is a new reputation service from Symantec. Our servers analyze Web sites to see how they will affect you and your computer.
McAfee SiteAdvisor Software – Website Safety Rating – Tests websites for spyware, spam and scams so you can search, surf and shop more safely.
Google Safe Browsing diagnostics – Google uses automatic algorithms and user feedback to compile lists of sites that may be dangerous. Just change the name of the site in the above URL.
WOT Web of Trust – Check the reputation rating of any Website.
AVG Online Web Page Scanner – lets you check the safety of individual web pages you are about to visit. LinkScanner will examine the web page in real time to see whether it’s hiding any suspicious downloads.
Sucuri Security Scanner – This scanner will alert you if it finds any Malware, spam, security issues.
Unmask Parasites – a simple online web site security service that helps reveal hidden illicit content (parasites) that hackers insert into benign web pages using various security holes.
BrightCloud – Content, reputation and threat analysis on URL or IP.
PhishTank – PhishTank is a free community site where anyone can submit, verify, track and share phishing data. Enter the URL into the “Is it a phish?” field on the PhishTank to check an individual URL against the PT database.
URLVoid.com BETA – Check Reputation of Domains and Subdomains. Scan Websites for Exploits, Malware and other Malicious Threats.
Trend Micro Site Safety Center
Cisco IronPort SenderBase Security Network – Web and Email Reputation Look Up – IP address, URI or Domain based.
Wepawet – runs various analyses on the URLs or files that you submit. At the end of the analysis phase, it tells you whether the resource is malicious or benign and provides information that helps you understand why it was classified one way or the other. Wepawet does not just tell you that a resource is malicious; it also shows you the exact vulnerability (or, more likely, vulnerabilities) exploited during an attack.
Qualys Free Scan – allows you to quickly and accurately scan your server for thousands of vulnerabilities that could be exploited by an attacker. If vulnerabilities exist on the IP address provided, FreeScan will find them and provide detailed information on each risk – including its severity, associated threat, and potential impact. It even provides links to give you more information about the vulnerability and how to correct it.
TrustedSource – Internet reputation system – McAfee TrustedSource is the world's largest Mail, Web, and Network reputation system, proactively identifying senders/hosts of spam, phishing, and malware attacks. It allows you to enter an IP address, domain name or URL to check reputation/traffic patterns.
hpHosts Online – hpHosts is a community managed and maintained hosts file that allows an additional layer of protection against access to ad, tracking and malicious websites. This database has been created to allow simple, and quick confirmation of a site’s listing in the hpHosts HOSTS file.
urlQuery – Free online URL scanner – urlQuery.net is a service for detecting and analyzing web-based malware. It provides detailed information about the actions a browser takes while visiting a site and presents the information for further analysis.
Dasient Web Anti-Malware (WAM) – Dasient’s Web Anti-Malware (WAM) solution consists of 3 services. Blacklist Monitoring frequently checks a customer’s website against a variety of blacklists. If the website appears on a blacklist, the customer receives an instant alert. The customer can subsequently return to dasient.com and diagnose any problems with the blacklisted site. Malware Monitoring periodically scans a customer’s website for malware infections. If Dasient detects that a customer’s website has been infected, the customer receives an immediate alert with diagnostic information to remove the infection. Quarantining automatically quarantines a malware infection discovered by Dasient’s Malware Monitoring system. Dasient’s WAM Quarantining service leverages a web server module that is installed by the customer (or its web hosting provider).
SiteTruth site rating – search, with less evil. – SiteTruth exists to solve one of the Web’s biggest problems – unidentified, and possibly fake, on-line businesses.
Zscaler Zulu URL Risk Analyzer – Zulu is a dynamic risk scoring engine for web based content. For a given URL, Zulu will retrieve the content and apply a variety of checks in three different categories.
Find Parasites – This service will scan the URL you insert in the form below and will output all the live links, iframes and external scripts found. You can use this service to analyze your website and see if there are unknown iframes or links that point to unknown domains.
OnlineLinkScan – Online Link Scan offers protection through early detection, scanning for harmful threats hidden behind innocuous-looking links such as 301/302 header redirects. It allows you to scan suspicious links that might be infected with viruses, trojan horses, spyware and other malware.
URL & Link Scanner – Scan URLs for malicious code – Scan URLs with Multiple Antivirus Engines
Online Web Safety Scan – Online Web Safety Scan will inspect the URL of the site or web page you want to visit in real-time for whether it is hiding any exploit code and, if so, what exploit.
Web-sniffer – View request and response header of a HTTP connection and HTML source without actually visiting the Website in your browser.
vURL Online webpage dissection service – Quickly and safely dissect malicious or suspect websites. This service is completely free and allows you to view the source code within a webpage without your having to visit the site itself.
Browser Defender – Browser Defender detects potentially unsafe sites and warns you about them.
eval gzinflate base64_decode Online Decode Tool
php $o="encrypted text" Decoder
php $_F=__FILE__;$_X= Byterun Decoder
gred – gred is a free web security service that can help you determine whether the web site is safe or warning. Unlike traditional tools, gred does not rely on a pre-determined list of unsafe URL list (URL blacklist) since content of web sites can change anytime.
Malware Database – abuse.ch Malware Database (AMaDa)- Search for a Domain name, IP address or MD5 hash
Threatlog.com – logs malicious domains that contain malicious content, browser exploits, used for phishing or for scams.
Web Security Guard Websites Databases – The Web Security Guard website database includes information about thousands of websites, with user ratings and reviews. Our team of analysts extends the Web Security Guard database every day to ensure effective protection for your computer and privacy.
ScanURL.net – Website/URL/Link Scanner Safety Check for Phishing, Malware, Viruses
gamasec – Free Blacklist Checker

Website Security Check Tools – Download


Acunetix Web Vulnerability Scanner – Hackers are on the lookout for Cross Site Scripting (XSS) vulnerabilities in YOUR web applications: Shopping carts, forms, login pages, dynamic content are easy targets. Beat them to it and scan your web applications with Acunetix Web Vulnerability Scanner. It will chart out your website and identify Cross Site Scripting (XSS) Vulnerabilities.
Web Site Security Audit – WSSA – examines your website pages, applications and web servers to find security weaknesses and vulnerabilities that would give hackers an opportunity to do damage.
Nikto – an Open Source (GPL) web server scanner which performs comprehensive tests against web servers for multiple items, including over 3500 potentially dangerous files/CGIs, versions on over 900 servers, and version specific problems on over 250 servers. Scan items and plugins are frequently updated and can be automatically updated (if desired).
Wikto – Wikto is Nikto for Windows – but with a couple of fancy extra features including Fuzzy logic error code checking, a back-end miner, Google assisted directory mining and real time HTTP request/response monitoring. Wikto is coded in C# and requires the .NET framework.
iScanner – Remove website malwares, web pages viruses and malicious codes – a free open source tool lets you detect and remove malicious codes and web pages malwares from your website easily and automatically. iScanner will not only show you the infected files in your server but it’s also able to clean these files by removing the malware code ONLY from the infected files.

Wednesday 27 March 2013

Craigslist Asks Google: Why Did You Stop Indexing Our Site?

Google Stops Indexing Craigslist; Matt Cutts Fixes
Craigslist and Google
Tempest Nathan started a Hacker News thread pointing to a blog post in which he claimed that Google had stopped indexing Craigslist.

Hard to believe, but it was correct: Google did halt indexing Craigslist. But why?

If Craigslist had been spamming Google, that would make sense. But did Craigslist spam Google? Did they break the Google webmaster guidelines? Did they add the noindex directive to their pages? Nah.

It was a technical quirk

Matt Cutts, the head of search spam at Google, explained in the Hacker News thread that they were solving the problem on Google's side, and described what had technically occurred:
To understand what happened, you need to know about the “Expires” HTTP header and Google’s “unavailable_after” extension to the Robots Exclusion Protocol. As you can see at http://googleblog.blogspot.com/2007/07/robots-exclusion-protocol-now-with-even.html , Google’s “unavailable_after” lets a website say “after date X, remove this page from Google’s main web search results.” In contrast, the “Expires” HTTP header relates to caching, and gives the date when a page is considered stale.

A few years ago, users were complaining that Google was returning pages from Craigslist that were defunct or where the offer had expired a long time ago. And at the time, Craigslist was using the “Expires” HTTP header as if it were “unavailable_after”–that is, the Expires header was describing when the listing on Craigslist was obsolete and shouldn’t be shown to users. We ended up writing an algorithm for sites that appeared to be using the Expires header (instead of “unavailable_after”) to try to list when content was defunct and shouldn’t be shown anymore.

You might be able to see where this is going. Not too long ago, Craigslist changed how they generated the “Expires” HTTP header. It looks like they moved to the traditional interpretation of Expires for caching, and our indexing system didn’t notice. We’re in the process of fixing this, and I expect it to be fixed pretty quickly. The indexing team has already corrected this, so now it’s just a matter of re-crawling Craigslist over the next few days.

So we were trying to go the extra mile to help users not see defunct pages, but that caused an issue when Craigslist changed how they used the “Expires” HTTP header. It sounded like you preferred Google’s Custom Search API over Bing’s so it should be safe to switch back to Google if you want. Thanks again for pointing this out
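To make the two headers concrete, here is a sketch (dates hypothetical):

Expires: Fri, 05 Apr 2013 00:00:00 GMT
X-Robots-Tag: unavailable_after: 5-Apr-2013 00:00:00 GMT

The first tells caches that the fetched copy goes stale on that date; the second tells Google to drop the page from its main web results after that date. Craigslist's listings needed the second meaning but were sending the first header, and Google's special-case handling papered over the gap until Craigslist changed how it generated the header.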

Read More URL:
http://semandseo.blogspot.in/2013/03/google-stops-indexing-craigslist.html

Google Penguin Update Recovery: Matt Cutts Says To Watch These Two Videos



Guess what: in both videos, he talks about Google's standard guidelines. As far as Google is concerned, that is your comeback manual.

How to recover penalized websites?

If you want to recover a penalized website, then you should work on the following:

1) Replace your existing content with unique, relevant, quality, user-friendly content.
2) It is even better if you change the URL names.
3) Upload the site to the server again.
4) Then restart SEO for the website.

I hope it will be beneficial for you.


More Information to recover penalized website:

http://blog.eukhost.com/webhosting/avoid-recover-from-google-penalty/
http://www.ethinos.com/blog/2012/11/01/tips-to-recover-your-penalized-websites-rankings/

Thursday 21 March 2013

SEO Tips 2013

These are useful SEO tips for 2013.
1. Quality Content
Believe it or not, quality content is the most important thing for SEO. Even if you do everything else right, without quality content you can't get a good search engine rank.
2. Important HTML Tags
A few years back, the Meta description and Meta keywords tags were very important for SEO, but now major search engines like Google ignore Meta keywords.
The Title tag, Meta description tag, image alt tag, H1 tag and H2 tag are the most important tags for SEO.
If your blogging platform is WordPress, you can use the Yoast plugin for SEO.
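Here is a hypothetical skeleton showing those five tags in place:

<title>SEO Tips 2013 | Example Blog</title>
<meta name="description" content="Five practical SEO tips for 2013, from content quality to link building.">
<h1>SEO Tips 2013</h1>
<h2>1. Quality Content</h2>
<img src="ranking-chart.png" alt="Chart of ranking changes after the Penguin update">

Note that the alt text describes the image for crawlers (and screen readers); it is not a place to stuff keywords.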
3. Post Length
Before the Panda and Penguin updates, writing five posts of a hundred words each was better than writing one post of five hundred words. Now it is the reverse: one five-hundred-word post is worth more than five hundred-word posts.
You should write posts of 300+ words, and it is better if you can write posts of 600-2,000 words. But don't forget that quality beats quantity, so don't write useless sentences just to increase the word count.
4. Pay Attention To Other Search Engines
There is no doubt that Google is the best search engine, yet almost every blogger tries to optimize posts only for Google without thinking about other search engines like Yahoo or Bing.
Every search engine has its own algorithm for indexing web pages, so do a little research on how to optimize posts for Yahoo and Bing. If you study them well, you can easily drive more traffic via Yahoo and Bing, because many bloggers don't bother optimizing for them (but never forget Google).
5. Link Building
Link building is the main off-page optimization technique. When creating backlinks, pay attention to the following points.
1. Always try to create quality backlinks.
2. Don't use any script or software to create backlinks (one quality backlink is more powerful than 100 bad backlinks).
3. Create both no-follow and do-follow backlinks; don't create only do-follow backlinks, because Google might consider that unnatural (see the example after this list).
4. Don't use any spam method to create backlinks.
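The do-follow/no-follow difference in point 3 comes down to a single attribute (URLs hypothetical):

<a href="http://example.com/">followed link: passes PageRank</a>
<a href="http://example.com/" rel="nofollow">no-followed link: passes no PageRank</a>

A natural link profile contains both, because you cannot control which kind other sites give you.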

Friday 15 March 2013

On-Page SEO Best Practices in 2013: 7 Rules of the Game


By now, I’m sure you’ve heard enough about on-page optimization to last a lifetime. I don’t want to repeat the same mantras you’ve been hearing since last year. Yes, on-page SEO has become more important (I can hardly remember a time when it wasn’t), and yes, on-page SEO can make or break your chances at ranking high on Google SERPs. But what has changed is the way we perceive and behave toward on-page SEO.
Most SEOs tend to think of on-page optimization as a very specific technical influx of code. You know the drill: meta tags, canonical URLs, alt tags, proper encoding, well-crafted, character-limit-abiding title tags, etc.
Those are the basics. And at this point, they are very old-school. They continue to appear on the on-page SEO checklist, but you and I know that the whole landscape of SEO has changed vastly, even though the basic premise has remained the same. Because of that change, the way you perceive on-page SEO has to adjust as well. That's what we're going to look at now.
On Page SEO: The Foundation
If your website isn’t properly optimized on-page, your efforts off the website (link building, content marketing, social media) probably won’t yield substantial results. Not that they won’t generate anything at all, but more than half your efforts may end up going down the drain.
There’s no clear rule book that says: do X, Y, and Z in on-page optimization and your rank will rise by A, B, or C. On-page optimization is based on tests, analytics and errors. You learn more about it by discovering what doesn’t work than what does.
But of all the things to keep in mind, there’s this: If you don’t take care of your on-page SEO, you’re likely going to fall or stay behind: in rankings, in conversions, and in ROI.
Why The Fuss?
But first let’s clear this one up: Why the fuss about on-page SEO? After all, there’s a ton of material available about it already. Many experts have written well about it.
The changing landscape of search engine algorithms has altered the factors playing into how one chooses to perform SEO. You can no longer think in terms of keywords and inbound links alone. Similarly, you can no longer think in terms of the meta and alt tags alone (yes, this includes the title tag, too).
On-page SEO isn’t just about how your site is coded. It’s also about how your site looks bare-bones (the robot view), and how your website responds to different screens. It includes load times and authority. And with the direction that Google is headed in 2013 and beyond, it’s clear that on-page elements and off-page elements must line up and agree with each other in a natural, clear, organic manner. That’s why we need to reevaluate on-page SEO a little more carefully.
1. Meta Tags Are Just the Beginning
We've known and used meta tags since their arrival. The meta "keyword" tag is long gone as an SEO ranking factor, but a lot of heat has been generated in discussions about the utility of meta description tags from an SEO point of view.
More significant than any ranking effect is the fact that meta description tags provide an opportunity to affect how your website is displayed in search results. A great meta description tag can get your result clicked before the guy ranking above you. It's still good practice to use keywords when you can, along with geographic identifiers (when applicable), but first and foremost should be the intent to attract clicks from humans.
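For example, here is a hypothetical head section for a local business page (the copy is the point, not the markup):

<title>Emergency Plumber in Austin, TX | Smith Plumbing</title>
<meta name="description" content="24-hour emergency plumbing in Austin, TX. Licensed, insured, and on site within the hour. Call for a free estimate.">

The description won't move your ranking, but it is your two-line ad on the results page, so write it for humans first.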
2. Canonical, Duplicate, Broken Links, etc.
Google’s robots have become very smart, to the point where broken links and duplicate pages raise red flags faster than a bullet. That is precisely why you’ll find canonical links (and their corresponding codes) to be highly important.
Broken links and dupes aren’t just anti-SEO. They are anti-user too. What’s your first reaction when you click on a link that just shows a page error?
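Canonical tags are the cheapest fix for the duplicate side of this problem. A minimal sketch, with hypothetical URLs:

<link rel="canonical" href="http://www.example.com/red-widgets/">

Placed in the <head> of variants like /red-widgets/?sessionid=123 or /red-widgets/?sort=price, it tells the crawler which copy is the real one, so the duplicates consolidate instead of competing with each other.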
3. The Robot’s Point of View
Text remains the most important part of any website even today. While Google does rank some videos and media higher than others for certain keywords, well-formatted and content-rich websites still rule the roost.
To get a view of how your website looks to the crawlers, you can disable JavaScript and images (under your browser's preferences/settings) and take a look at the resulting page.
Though not totally accurate, the result is pretty much how your website looks to the crawler. Now, verify all the items on the following checklist:
  • Is your logo showing up as text?
  • Is the navigation working correctly? Does it break?
  • Is the main content of your page showing up right after the navigation?
  • Are there any hidden elements that show up when JS is disabled?
  • Is the content formatted properly?
  • Are all other pieces of the page (ads, banner images, sign-up forms, links, etc.) showing up after the main content?
The basic idea is to make sure the main content (the part you want Google to note) comes as early as possible with the relevant titles and descriptions in place.
4. Load Time Averages and Size
Google has long noted the size and average load times of pages. By most accounts, this goes into the ranking algorithm and affects your position in the SERPs. This means you can have pretty good content on your website, but if the pages load slowly, Google is going to be wary of ranking you higher than other websites that load faster.
Google is all for user satisfaction. They want to show their users relevant results that are also easily accessible. If you have tons of javascript snippets, widgets, and other elements that slow down the load times, Google isn’t going to award you a high ranking.
5. Think Mobile, Think Responsive
This is one of the most hotly discussed topics in online marketing today. From mobile ads and local search to market trends in desktop/tablet consumption, it's clear that a mobile-optimized site is the wave of the future.
When you think of a mobile/responsive website, how do you go about it? Responsive as in CSS media queries, or entirely new domains like "m.domain.com"? The former is often recommended because it keeps things on the same domain (link juice, no duplication, etc.). It keeps things simpler.
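Here is a minimal sketch of the media-query route, with a hypothetical breakpoint:

<style>
  .sidebar { float: right; width: 300px; }
  @media screen and (max-width: 480px) {
    .sidebar { float: none; width: 100%; } /* stack the sidebar on small screens */
  }
</style>

One URL and one set of content; only the presentation adapts to the screen.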
6. Authority & AuthorRank
The author meta tag gets a new lease on life with Google promoting the AuthorRank metric. It's a little more complex than that now, however: you have to enable rich snippets for your website, make sure your Google+ profile is filled out, and link it to your blog/website. AuthorRank has emerged as a very important and tangible metric that affects page rank, and it is one of the on-page SEO tactics you should definitely pursue. Not only will it improve your rankings, but it will also improve your click-through rate in the SERPs.
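At the time of writing, the wiring is a two-way handshake (profile URL hypothetical):

<link rel="author" href="https://plus.google.com/111222333444555666777/">

goes in the <head> of your articles, and your Google+ profile must list the site under "Contributor to" so the connection is confirmed from both ends.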
7. Design Shouldn’t Be the Last Thing On Your List
Ironically, I had to save this for last because many people remember only the last thing they've read in an article. Hardcore SEO people regularly overlook the importance of design.
Aesthetics and readability stem directly from the design of a website. Google is good at figuring out what shows “above the fold” on websites, and Google explicitly recommends that you place content above the fold so your readers are treated to information rather than ads.
On-page SEO isn’t only about the meta code and the canonical URL. It’s about how your website connects to the user and to the robot. It’s about how you make sure your website is accessible and readable, and still has enough information under the hood for the search engines to pick up easily.

How to get faster indexing in Google, Yahoo and Live Search








  •     Submit an XML Sitemap (a minimal example follows this list)
  •     Keep a clean navigation structure
  •     Get quality links
  •     Go hot on Digg and other social networking sites
  •     Create unique and helpful content
  •     Use social bookmarking
  •     Verify your sites with Google Webmaster Tools, Yahoo Site Explorer and Live Webmaster Tools
  •     Remove canonical domain issues
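The XML Sitemap in the first item can be as small as this (URL and date hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-03-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Save it as sitemap.xml in your site root and submit it through Google Webmaster Tools.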


Another method to get faster indexing:

Resubmit your website's XML sitemap in Google Webmaster Tools. Then start social bookmarking all the URLs of your website; some good social bookmarking sites are Digg, StumbleUpon, Reddit, and Delicious.
You can do the following activities for effective indexing:
1. Join Facebook, MyBlogLog and Twitter
2. Add your URL to Google, Yahoo and MSN
3. Add your URL using Submit Express
4. Add your URL to FeedBurner and Google Webmaster Tools
5. Link to other blogs

Friday 1 March 2013

SERP Title Changes 2013

Over the last few days, people have noticed that Google has started changing titles in the SERPs in a new way.
Not only is Google rewording the <title> to what it thinks is more suitable/relevant - it is now apparently placing the brand name at the start of the <title>.

<title>My page title here</title>
is being shown in the SERPs as if it was
<title>CompanyName : My page title here</title>

Now, changes to titles aren't anything new - Google has been doing that (and the same to Descriptions) for years.
What may be different is that several reports mention not only the title change, but apparently alterations to rankings as well.

Here is an example: