Maximizing Your Site’s Long Tail Keyword Traffic

August 11, 2009

Long-tail keywords are the entry points to organic SERP traffic.

And they lay the groundwork for your site’s relevancy. Be sure to maximize where you can!

Maximizing Your Site’s Long Tail Keyword Traffic



Table of Contents:

# Maximizing Your Site’s Long Tail Keyword Traffic

# The analysis and power of long tail traffic

# The game of profit: competitive versus long tail terms

# A deep look at your content: how do you write for long tail terms?

via Maximizing Your Site’s Long Tail Keyword Traffic.


Google Webmaster’s Tools: Top 5 Uses

August 11, 2009

A decent reminder on importance and prioritizing. Using Google Webmaster Tools will only help if you get data that tells you what to do, though. As with all tools, use the data in context and not as dictation. 1% SEO for the little things, 99% SEO for everything else.

Chat Man

The Five Most Important Things You Can Do with Google Webmaster Tools



Table of Contents:

# The Five Most Important Things You Can Do with Google Webmaster Tools

# Finish Verifying the Site

# Second and Third Important GWT Tasks

# Fourth and Fifth Important GWT Tasks

via The Five Most Important Things You Can Do with Google Webmaster Tools.

Real Time Developments in Real Time….

August 11, 2009

When MySpace developed aspects of Facebook, Facebook bought FriendFeed; and now there’s the idea that Google is going to buy Twitter while Bing rewrites the competitive landscape….

There’s a new sheriff in town – and we don’t know if it’s social or not…. But if it is, you can find it with the new Facebook feature of real-time searching.

Social Networking

Facebook Rolls Out Real-Time Search

So according to the Facebook blog, we can now search up to the last 30 days of our Facebook News Feed containing status updates, photos, links, videos and notes shared by our Facebook friends.

via Facebook Rolls Out Real-Time Search.

Caffeine Update – Moogle Insights

August 11, 2009

Matt posts a quickie to inform the masses that the Caffeine update is not, directly, a SERP change, but more of an update to the underlying chassis of Google’s search infrastructure.

Still, there may be some SERP changes, so you should check ’em out at

Matt Cutts: Gadgets, Google, and SEO

More info on the Caffeine Update

August 10, 2009

in Google/SEO

via More info on the Caffeine Update.

Have You Bing Crawled?

August 11, 2009

A post on the Live blog goes into discussion of the MSNBot and crawl delay.

My professional opinion leans toward not setting a crawl rate at all. SEs will gauge the ‘crawlability’ of your site and will crawl at a frequency best suited to that site.

However, I do find it rather useful to set dynamic pages, such as news, blogs, and the like, to a more frequent crawl setting so that the SEs are aware of, and can find, new, fresh content.
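For sites that do want to throttle the bot, the directive the post discusses lives in robots.txt. A minimal sketch (the delay value and the decision to throttle at all are hypothetical examples, not recommendations):

```
# robots.txt – hypothetical example
User-agent: msnbot
Crawl-delay: 5     # ask MSNBot to wait roughly 5 seconds between requests

User-agent: *
Disallow:          # all other crawlers: no restrictions, no delay
```

Note that Crawl-delay only slows a crawler down; it does nothing to make fresh content get found faster.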

Crawl delay and the Bing crawler, MSNBot

Should you set a crawl delay?

Many factors affect the crawling of a site, including (but not limited to):

* The total number of pages on a site (is the site small, large, or somewhere in-between?)

* The size of the content (PDFs and Microsoft Office files are typically much larger than regular HTML files)

* The freshness of the content (how often is content added/removed/changed?)

* The number of allowed concurrent connections (a function of the web server infrastructure)

* The bandwidth of the site (a function of the host’s service provider; the lower the bandwidth, the lower the server’s capacity to serve page requests)

* How highly the site ranks (content judged as not relevant won’t be crawled as often as highly relevant content)

The rate at which a site is crawled is an amalgam of all of those factors and more.

via Bing – Crawl delay and the Bing crawler, MSNBot – Webmaster Blog – Bing Community.

Phishing for Demographics

August 10, 2009

For a visibility expert focused primarily on results, knowing that your design efforts are ranking well across ALL engines is pretty important.

For a few individuals, single-engine optimization is the holy grail. With the development of Bing and the inevitable ‘others to come,’ it isn’t the wisest choice to over-optimize for a single engine. The simple fact is that REAL SEO is universal, allowing for dominating rankings across all engines.

A tool that I’m sure I’ll use a fair amount is ‘Blind Search’ – just to gauge how a site ranks across the big three without having to open tabs and such. I use Dogpile to gauge overall rank, but seeing all three SERPs, clutter-free, is pretty nice; especially because there’s no need to install any add-ons – it’s just a search page.

But, this has got to be some interesting search data that’s coming across… After a year, a decent trend curve could be shown over a myriad of markets, all based on live, direct searching – WITH VOTING TO BOOT!! I wonder what, if any, data Mr. Kordahi might be interested in sharing….

Blind Search: Side By Side Search Results For Google, Yahoo & Bing

2009 August 9

by MHB

An employee of Microsoft, Michael Kordahi, has launched a “fun experiment” side project to see how the three major search engines return results for the same keyword.

He calls the service, “BlindSearch, the search engine taste test.”

“The goal of this site is simple, we want to see what happens when you remove the branding from search engines. How differently will you perceive the results?”

The columns are randomized with every query, so Google isn’t always in the first, second or third position.

Obviously, this works best for a site you own, to see how it ranks side by side on the three major search engines, Google, Yahoo and Bing.

I did this for “the domains”, “luxury bedding” and “discount bedding”, and the returned results for each of the three search engines did match what I found doing a direct search.

In some cases Google was the best, returning the domain higher and sometimes Bing returned the better result.

Try it for yourself.

Michael makes the following disclaimer on this site:

“The system has many flaws that I know about already, the primary one of interest is the lack of localisation. So, all searches are going through the US as US searches. The other deficiency worth noting is that there is much missing from the actual experience of using these search engines eg, image thumbnails, suggestions, refine queries etc.”

via Blind Search: Side By Side Search Results For Google, Yahoo & Bing | The Domains.

1% SEO: The Little Things Add Up

July 31, 2009

While working for a client of mine with a large site, I was constantly expected to perform ‘miracles’ of SEO. Time and time again, I tried to explain that achieving top rank for his home page was going to be difficult as long as dead links, broken code, bad layout, and other seemingly insignificant factors went unaddressed.

Needless to say, although I was able to optimize for commanding rankings across Bing, Dogpile, Ask, and Google on 2nd-tier keyphrases, the top-tier phrases that the client really wanted stayed just beyond reach. And never once did I complain. I just reiterated that once you have 90% of the SEO done, there are still 1% factors that need to be addressed – and when you’re going after fiercely targeted keyphrases, the luxury of ‘but the competition doesn’t seem to have those problems’ as a point of insight is no longer relevant, really.

When minor tech issues are the only problems you can find, you’re much better off just correcting them.

So, even though Bruce seems to be reiterating obvious points that should be considered and optimized, they are points that all too many feel won’t matter in the long run…. They might be points that don’t make *the* difference, but they do have their place.

Chat Man – SEO is in the Details: Bruce Busts the Boondoggle – SEO Blog.

HTML Constructs

It used to be that a site needed every meta tag in order to rank. That’s no longer the case. No single element will cause a site to rank, but it’s crucial to remember that SEO is effective as a whole.

Make your site all that it can be by using HTML as it was intended.
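As a rough illustration of “HTML as it was intended” (the page title, description, and headings below are invented examples), a semantically structured page might look like:

```html
<!-- Hypothetical example: basic semantic HTML structure -->
<!DOCTYPE html>
<html lang="en">
<head>
  <title>Long Tail Keyword Research Guide | Example Site</title>
  <meta name="description" content="How to find and write for long tail keyphrases.">
</head>
<body>
  <h1>Long Tail Keyword Research Guide</h1>  <!-- one h1 describing the page -->
  <h2>Why long tail terms convert</h2>       <!-- headings in order, not used for styling -->
  <p>Body copy goes here.</p>
</body>
</html>
```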

XML Sitemaps

An XML Sitemap often causes pages to be added to the index, which is an important objective of search engine optimization. Also, consider the use of an XML Sitemap when redirecting one site to another.
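A minimal XML Sitemap, per the sitemaps.org protocol, looks like the sketch below (the URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-08-01</lastmod>      <!-- optional: last modification date -->
    <changefreq>weekly</changefreq>    <!-- optional hint, not a command -->
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Reference the file from robots.txt (`Sitemap: http://www.example.com/sitemap.xml`) or submit it through the engines’ webmaster tools.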

Keywords in URLs and Image Links

Keyword-rich URLs add value to a site in more than one way.
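For instance (hypothetical URLs and filenames), compare an opaque link with a keyword-rich URL and image link:

```html
<!-- Opaque: the URL and link tell engines nothing about the page -->
<a href="http://www.example.com/p?id=4182">item</a>

<!-- Keyword-rich URL, image filename, and alt text all carry the phrase -->
<a href="http://www.example.com/luxury-bedding/egyptian-cotton-sheets">
  <img src="/images/egyptian-cotton-sheets.jpg" alt="Egyptian cotton sheets">
</a>
```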

Page Copy

There is no magic [keyword] density, but there are winning page footprints. Density alone is nearly useless, but it does have its place.