
Tuesday, June 16, 2015

Why "Googling Yourself" is not an excuse to ignore keyword rank tracking for SEO

I was recently asked my opinion on some SEO advice that had been shared with a client. The advice was essentially: ignore keyword rank tracking, because all Google results are personalized and you can't accurately track rankings by "Googling Yourself"; instead, look to organic traffic in Google Analytics as the sole indicator of SEO success.

First off, "Googling Yourself" seems like a bit of a misnomer for the act of checking Google ranks by searching a keyword phrase yourself. I think of Googling my name when I read that, but the meaning is evident in context. Ironically, one of the biggest issues I have with this advice has to do with context itself... read on to see what I mean.

There are a few things this advice gets right:

  1. Keyword rank tracking can be considered unreliable, since search results are personalized.
  2. "Googling Yourself" can provide misleading information on keyword rankings since your results are personalized.
  3. Organic traffic in Analytics is a great indicator of overall SEO success.


Beyond those points, however, this advice falls well short of effective SEO, and there are ways to check keyword rankings reliably.

While "Googling Yourself" is not a reliable way to check rankings, and Google results are indeed personalized, that doesn't mean keyword rankings don't matter for SEO. This is why we built our own in-house keyword rank checking tool that uses Google's Search API to determine ranks. The Search API doesn't factor in any personalization, but it also isn't an exact mirror of the Google.com search engine. Even so, we believe it provides an authoritative baseline, and the ranks it returns are the most reliable we can get.
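Our in-house tool isn't public, but the core idea is simple enough to sketch. The snippet below illustrates it using Google's Custom Search JSON API (the `API_KEY` and `ENGINE_ID` values are hypothetical placeholders you would replace with your own credentials, and a Custom Search index, like the Search API, is not an exact mirror of Google.com): fetch a page of results for a phrase, then scan for the first result on your domain.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical credentials -- substitute your own API key and engine ID.
API_KEY = "YOUR_API_KEY"
ENGINE_ID = "YOUR_ENGINE_ID"

def fetch_results(query, start=1):
    """Fetch one page (up to 10 results) from the Custom Search JSON API."""
    params = urllib.parse.urlencode(
        {"key": API_KEY, "cx": ENGINE_ID, "q": query, "start": start})
    url = "https://www.googleapis.com/customsearch/v1?" + params
    with urllib.request.urlopen(url) as resp:
        return json.load(resp).get("items", [])

def find_rank(items, domain):
    """Return the 1-based position of the first result on `domain`, or None."""
    for position, item in enumerate(items, start=1):
        if urllib.parse.urlparse(item["link"]).netloc.endswith(domain):
            return position
    return None
```

Run `find_rank(fetch_results("your keyword phrase"), "yourdomain.com")` on a schedule and log the results, and you have an unpersonalized baseline to track over time.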

Keyword rankings and activity are important.

Knowing how users found your website, specifically what keyword terms were used, is very beneficial. It lets you know what is working and what isn't, and it sheds light on user perception and intent. In the past, Google Analytics reported the keywords that people used to reach your website. This has since been all but stopped by Google (an entirely separate topic). However, you can still get some referral keyword insight through Google Webmaster Tools. It isn't ideal, but it does offer some insight.

That said, knowing what keyword searches led to visits is a passive exercise. Keyword research and targeting is an active one. Why discount it just because rank is difficult to assess? Why accept overall organic traffic patterns as a sufficient indicator of SEO success?

Being proactive about SEO is better than passively watching organic traffic numbers.

It is true that the best overall indicator of how well you are doing on SEO is organic traffic. That is the end goal in almost every case: more traffic from the search engines, period. The more organic traffic, the better you are doing, but how does this help you be proactive about SEO? It simply doesn't. It doesn't even help you be reactive, because you have no insight into why your organic traffic is what it is. In this scenario, you simply implement some best practices (hopefully), hope for Google's favor, and anxiously wait for your organic traffic to increase. Google is smart enough that this is actually still pretty effective, but it is missing a huge piece of the equation... keyword rank tracking.

Without keyword research and tracking, you can't know which keywords are competitive or low-hanging fruit, effective or ineffective. With keyword research and tracking, you can come up with a plan to target specific phrases. This is where you can get an edge on rankings and proactively increase your organic traffic.

Google wants to provide the best results to its users, but it needs the help of websites to make that possible. If your website cooperates by offering up clean and clear indications of context, you are helping Google identify what your website is all about. You are also laser-focusing your content toward that context, which helps build authority in Google's eyes. You can partly accomplish this laser focus through a natural but deliberate application of keyword phrases in all the right places (headings, lists, titles, descriptions, etc.). Do not mistake this for keyword stuffing; approach it as naturally editing your copy to provide consistency and focus.

In summary:

Search rankings should not be discounted or ignored. Checking ranks on specific search terms (via a tool like ours) isn't going to provide an accurate view of your overall organic search performance, only of the sub-set of terms you determine worthy of tracking. However, those specific keyword rankings do affect the personalized search results everyone sees. If you rank high on highly relevant keyword searches, your chances of ranking high in related personalized results are much higher. Rankings also offer invaluable insight into how well Google associates you with what you believe the context of your website to be. If you are going after a specific niche, targeting specific keyword phrases and monitoring their performance is a huge part of reaching that niche. It all helps Google understand context (who you are) and authority (what you know or have to offer).

Client example:

Passive (keyword targeting strategy NOT in place): 1,035 organic visitors (12/2014)
Proactive (keyword targeting strategy in place): 1,723 organic visitors (2/2015)

The December sample shows how the website was doing on organic traffic without keyword research and targeting; the February sample shows how it was doing after keyword targeting had been in place for only a few weeks.
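To put the two samples in perspective, the month-over-month lift works out as follows:

```python
# Organic visitors from the client example above.
before = 1035  # 12/2014, no keyword targeting strategy
after = 1723   # 2/2015, keyword targeting strategy in place

lift = (after - before) / before
print(f"{lift:.0%} increase in organic visitors")  # 66% increase in organic visitors
```

Roughly a two-thirds increase in organic traffic in a matter of weeks.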

There is too much potential benefit to ignore keyword rankings simply because Google personalizes results.

Sunday, April 13, 2014

Forcing Google to NOT Crawl and Index Mobile Site

Let's assume you've got a desktop site and a mobile version on different subdomains or subdirectories. Google might actually index your mobile site's URLs and serve them up in desktop search results... yes, I have been a victim of this and seen it myself (see screenshot below). It is surprising, but Google can't get this right 100% of the time.
(Screenshot: desktop search result showing a mobile URL in the Google index)
So what do you do?

You have two main options to force Google's desktop crawler toward your desktop site and its mobile crawler toward your mobile site: set up Webmaster Tools to specify this arrangement, and/or modify your robots.txt. The Webmaster Tools option pretty much consists of registering both sites in Webmaster Tools and setting up distinct sitemaps. This still leaves a lot to chance, since you are trusting Google to follow these directions when indexing your URLs. It also only works if your mobile site is on a different subdomain. If your mobile site resides in a subdirectory, like http://www.gtautomax.com's mobile site (http://www.gtautomax.com/mobile/), then modifying robots.txt is your main option.

The strategy is to direct Googlebot and other desktop search bots away from the mobile site and allow Googlebot-Mobile and other mobile search bots to access the mobile site. (This assumes your site is already redirecting users from desktop URLs to mobile URLs automatically.)


If you have different desktop/mobile subdomains, your robots.txt will look something like this:

Desktop Site:
User-agent: Googlebot
User-agent: Slurp
User-agent: bingbot
Allow: /

User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Disallow: /

Mobile Site:
User-agent: Googlebot
User-agent: Slurp
User-agent: bingbot
Disallow: /

User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Allow: /


If your desktop/mobile sites are differentiated by subdirectory, there is only one host and therefore only one robots.txt file (served at the site root), and it will look something like this:

User-agent: Googlebot
User-agent: Slurp
User-agent: bingbot
Disallow: /mobile/

User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Allow: /mobile/


As always, make sure to check your robots.txt file in Webmaster Tools for errors and to ensure mobile URLs and desktop URLs are handled properly.
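You can also sanity-check the rules locally with Python's standard-library `urllib.robotparser` before deploying. One caveat: Python's parser matches user-agent groups by simple substring comparison rather than Google's longest-match rule, so it can mis-handle the Googlebot vs. Googlebot-Mobile distinction; the sketch below (using the placeholder domain example.com) therefore only checks the desktop Googlebot against the subdirectory rules above.

```python
import urllib.robotparser

# Desktop rules from the subdirectory example above.
DESKTOP_RULES = """\
User-agent: Googlebot
User-agent: Slurp
User-agent: bingbot
Disallow: /mobile/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(DESKTOP_RULES.splitlines())

# Desktop Googlebot may crawl desktop pages...
print(rp.can_fetch("Googlebot", "http://www.example.com/inventory.html"))    # True
# ...but is kept out of the mobile subdirectory.
print(rp.can_fetch("Googlebot", "http://www.example.com/mobile/index.html")) # False
```

For the authoritative verdict on how Google itself interprets your file, still rely on the robots.txt tester in Webmaster Tools.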

Thursday, November 8, 2012

Google's Disavow Tool: Long awaited and quite ominous

It is no secret that many webmasters and SEOs have long awaited Google's backlink disavow tool. After Panda and Penguin, bad backlinks became toxic to a website's SEO efforts. Now that Google is allowing webmasters to disavow bad backlinks, the problem is solved, right?

Wrong. As is typical, solving one problem creates another. Let's take a look at an all-too-real hypothetical example.

Website A has 1,000 backlinks and took a big hit in rankings after the Panda and Penguin updates. The webmaster doesn't know what to do. The webmaster calls the SEO, who is in a state of panic, buried in emails and phone calls. Why? Because every one of the SEO's clients is having an SEO meltdown. Giving up on the SEO, the webmaster researches why the big drop in rankings occurred... the verdict: bad backlinks. The webmaster starts hunting down bad backlinks: any links with unnatural anchor text, spammy content, or biased one-sided content, pretty much anything that wasn't generated by an actual fan or customer. What to do with all these links? The webmaster spent so long building them, and now it is time to remove them, but they sit on sites outside the webmaster's control. Google finally releases the disavow tool, allowing the webmaster to indicate which links are bad and should not be counted in Google's search ranking algorithm. The webmaster kicks back and starts generating quality content... it is all over. Or is it?

Let's say that 400 of the 1,000 backlinks were disavowed. The webmaster has now effectively reported around 400 websites to Google as having no value. The webmaster's friends have done the same. Now that the webmaster is generating real quality content, there is no place to put it: all the webmaster's favorite article and blog posting sites have been mass-reported to Google via the disavow tool and hold little value now.

What is the answer? Generate content worth sharing. Share it yourself on social networks (embrace Google+) and encourage your fans and customers to share it as well. Participate in conversations on social sites and forums and offer genuinely valuable content.

YOUR CONTENT IS YOUR NEW PRODUCT. IT IS YOUR NEW "LINK-JUICE"
Generate good quality content regularly and offer value first and foremost. This can be easier than it seems once you have established a good methodology: schedules, ordered lists, and a weekly brainstorming session can provide enough direction for a month of good content. I will be glad to share my tips on content generation, so check back for that. Be sure to share this content and check back for more updates and tips.