Google Webmaster Tools: Top 5 Uses

August 11, 2009

SEOchat.com posts a decent reminder about importance and prioritization. Using Google Webmaster Tools will only help if you get data that tells you what to do, though. As with all tools, use the data in context and not as dictation. 1% SEO for the little things, 99% SEO for everything else.

Chat Man

The Five Most Important Things You Can Do with Google Webmaster Tools

Table of Contents:

* The Five Most Important Things You Can Do with Google Webmaster Tools

* Finish Verifying the Site

* Second and Third Important GWT Tasks

* Fourth and Fifth Important GWT Tasks

via The Five Most Important Things You Can Do with Google Webmaster Tools.


Caffeine Update – Moogle Insights

August 11, 2009

Matt posts a quickie to inform the masses that the Caffeine Update is not, directly, a SERP change, but more of an update to the chassis of how Google works.

There may be some SERP changes, though, so you should check ’em out at http://www2.sandbox.google.com/.

Matt Cutts: Gadgets, Google, and SEO

More info on the Caffeine Update

August 10, 2009

via More info on the Caffeine Update.


Have You Bing Crawled?

August 11, 2009

A post on the Bing webmaster blog discusses the MSNBot and crawl delay.

My professional opinion leans toward not setting a crawl rate at all. SEs will gauge the ‘crawlability’ of your site and will crawl at a frequency best suited to that site.

However, I do find it rather useful to set dynamic pages, such as news, blogs, and the like, to a more frequent crawl rate so that the SEs are aware of, and can find, new, fresh content.
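
If you do decide to hint at a pace, the place MSNBot looks is the Crawl-delay line in robots.txt. Here’s a quick sketch of that (the file contents and the 10-second delay are made-up examples, not a recommendation), checked with Python’s standard-library robots.txt parser:

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: ask MSNBot to wait 10 seconds between requests,
# while leaving every other crawler to pick its own pace.
robots_txt = """\
User-agent: msnbot
Crawl-delay: 10

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.crawl_delay("msnbot"))  # -> 10
print(parser.crawl_delay("*"))       # -> None (no delay set)

Note that Crawl-delay can only slow a bot down; hinting that content is fresh is a job for your sitemap, not robots.txt.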

Crawl delay and the Bing crawler, MSNBot

Should you set a crawl delay?

Many factors affect the crawling of a site, including (but not limited to):

* The total number of pages on a site (is the site small, large, or somewhere in-between?)

* The size of the content (PDFs and Microsoft Office files are typically much larger than regular HTML files)

* The freshness of the content (how often is content added/removed/changed?)

* The number of allowed concurrent connections (a function of the web server infrastructure)

* The bandwidth of the site (a function of the host’s service provider; the lower the bandwidth, the lower the server’s capacity to serve page requests)

* How highly does the site rank (content judged as not relevant won’t be crawled as often as highly relevant content)

The rate at which a site is crawled is an amalgam of all of those factors and more.

via Bing – Crawl delay and the Bing crawler, MSNBot – Webmaster Blog – Bing Community.


Gleaned Insight from Google Patents in 2007

July 31, 2009
Tedster over at webmasterworld.com made some pretty astute observations of Google’s 2007 patent: Document Scoring Based on Link-Based Criteria.

The post discusses how Google watches how often links in certain ‘areas’ of the page, such as footers, are switched out, in order to better detect paid linking. Many sites have relegated paid linking to footers and other segments of the page where it seems natural.

The ambiguity of whether or not a page’s content should be updated on a regular basis is touched on, but is left just as ambiguous as ever. I think it’s common sense: If you have a news page, it should change quite frequently. If you have a statistics page, it should change as often as stats are measured. If you have a resource/reference page, then it should be updated ONLY as often as authoritative information changes.

I’m personally delighted to see that ‘partial indexing of pages’ is being given serious focus, if only briefly.
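
Partial indexing hinges on the patent’s idea of storing ‘signatures’ of documents instead of the documents themselves (see the clipped notes below). Purely as an illustration (my own sketch, not Google’s actual method), a crawler could hash each page segment separately, so a footer full of swapped-out links stands out from a genuinely updated body:

import hashlib

def segment_signatures(segments):
    """Return a content hash per page segment (e.g. header, body, footer).

    `segments` is a hypothetical dict like {"body": "...", "footer": "..."};
    a real system would derive the segments from the rendered page layout.
    """
    return {
        name: hashlib.sha1(text.strip().lower().encode("utf-8")).hexdigest()
        for name, text in segments.items()
    }

def changed_segments(old_sigs, new_sigs):
    """Names of the segments whose signature differs between two crawls."""
    return [name for name in new_sigs if old_sigs.get(name) != new_sigs[name]]

# Example: only the footer links were swapped out between crawls.
first_crawl = segment_signatures({"body": "Our widget guide.", "footer": "Links: A B C"})
second_crawl = segment_signatures({"body": "Our widget guide.", "footer": "Links: X Y Z"})
print(changed_segments(first_crawl, second_crawl))  # ['footer']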

We all know, and have for some time, that the more inter-related terms a page ranks well for, the better it ranks for all of those terms, across the board.

I do, however, find that what is discussed about ranking ceilings, traffic throttling, and the yo-yo effect has some solid basis – as opposed to just giving SERP watchers something to complain about.

Enjoy!

Chat Man

clipped from www.webmasterworld.com
Google’s Patent on Backlinks – many interesting clues from 2007
tedster
With the current update apparently doing “something different” with backlinks, I went back for another reading of the 2007 patent application Document Scoring Based On Link-Based Criteria
PAGE SEGMENTATION and RATES OF CHANGE
Not only the back link juice itself is weighted differently, whether it changes is also given a different weight, depending on where the link appears on the page
PAGE CHANGES CAN IMPROVE OR LOWER RANKINGS
…it all depends on the query terms!
PARTIAL INDEXING OF PAGES
search engine may store “signatures” of documents instead of the (entire) documents themselves to detect changes to document content.
RANKING FOR SEVERAL SEARCHES
RANKING CEILINGS, TRAFFIC THROTTLING and the YO-YO EFFECT
Google may allow a ranking to grow only at a certain rate, or apply a certain maximum threshold of growth for defined period of time.
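
As a toy illustration of that last point (the 10% cap and all the numbers here are completely made up; nothing comes from the patent), a rate-limited ranking score might behave like this:

def capped_score(previous_score, raw_score, max_growth_per_period=0.10):
    """Limit how fast a document's score may rise in a single update period.

    Drops apply immediately; gains are clipped to a fixed percentage.
    The 10% figure is an arbitrary illustration, not a real Google value.
    """
    ceiling = previous_score * (1 + max_growth_per_period)
    return min(raw_score, ceiling) if raw_score > previous_score else raw_score

score = 50.0
for raw in (80.0, 80.0, 80.0):   # the page "deserves" 80 right away
    score = capped_score(score, raw)
    print(round(score, 1))       # 55.0, 60.5, 66.6 -> climbs gradually instead

Something even that simple would already produce the slow climbs (and, combined with sudden drops, the yo-yo pattern) that SERP watchers report.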

8 Tips to Power-Packed PDFs!

July 31, 2009
Sharon Housley of feedforall.com breaks down the important factors of PDF visibility into eight simple ‘to-dos.’ Keep these in mind when you publish your PDF to the web.

From my own experience with good ol’ fashioned trial and error, I also learned the following:

— Keeping text, especially Title Text and directional/decorated text, in its own PDF field is important. If you do not ‘field’ these items, you may end up with a jumbled mess of characters instead of a solid, spiderable, keyphrased title or header.

— Gain more keyphrase weight by reflecting/repeating keyphrases found on the pages that link to your PDF and the pages that your PDF links to. (If only this worked for HTML pages, too… Darn!)

— I don’t recommend archiving PDFs. Establish a URL and leave it!

— If you go through the trouble of publishing a PDF, at least get some credit for it by using in-copy linking with clean links.

— PDF Document Properties are also categorized and indexed. Not taking advantage of them is like not using alt text on your images. (There’s a quick scripted sketch of setting them right after this list.)

— Use ‘outline’ formatting where possible to auto-generate heading tag formatting.

— Don’t publish a photo without having something relevant to say about it. Use event names, dates, branding, etc. Avoid: “Joe at the bar.” Use: “ABC’s Summer Picnic 2009: Joe’s Drinks.”

— If you think about file naming logically, then picking off any minor visibility issues with ALL file names can only enhance spiderability.
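
Since Document Properties came up above, here’s a bare-bones sketch of setting them outside of Acrobat. The library (pypdf) and all of the file names and field values are my own picks for illustration, not anything from Housley’s article:

from pypdf import PdfReader, PdfWriter  # pip install pypdf

reader = PdfReader("summer-picnic-2009.pdf")   # hypothetical input file
writer = PdfWriter()
for page in reader.pages:
    writer.add_page(page)

# These Document Properties land in the PDF's info dictionary,
# the same fields Acrobat exposes under File -> Document Properties.
writer.add_metadata({
    "/Title": "ABC's Summer Picnic 2009: Joe's Drinks",
    "/Author": "Chat Man",
    "/Subject": "Event photo captions and branding",
    "/Keywords": "ABC, summer picnic, 2009, event photos",
})

with open("summer-picnic-2009-optimized.pdf", "wb") as out:
    writer.write(out)

Any PDF library that can write the info dictionary will do; pypdf is simply what I reached for here.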

If you have any tips of your own, leave a comment!

Chat Man

clipped from www.promotionworld.com

Search Engine Optimization for PDFs

by Sharon Housley
July 31, 2009

1. Text Based
the most important thing involves how the PDF file is created.
2. Optimize Text
Text contained in a PDF file is very similar to web copy
3. Link Depth
links to the PDF files that you wish to have indexed by the search engines should be on a web page that is frequently spidered by search engines.
4. Add Links
5. Document Properties
this is done in Adobe Acrobat. Details: File -> Document Properties; Advanced -> Document Meta-data
6. Bold Heading
Use H1 or H2 tags to make headings and sub-titles bold.
7. Caption Under Photos
search engines will index the image captions just as they would the “alt text” on a web page image
8. Name Of The PDF
Use related keywords in the actual file name for the PDF file.
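
To round out tip 8, here’s a tiny helper for turning a keyphrase into a clean, spider-friendly file name (the keyphrase below is just an example):

import re

def keyword_filename(phrase, extension="pdf"):
    """Turn a keyphrase into a lowercase, hyphen-separated file name."""
    slug = re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")
    return f"{slug}.{extension}"

print(keyword_filename("Search Engine Optimization for PDFs"))
# -> search-engine-optimization-for-pdfs.pdf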

I said this a year ago….

July 31, 2009
Almost a year ago, exactly, I commented that the landscape was changing. Now, with Yahoo! and Bing teaming up, it seems that the front-runner, Google, will no longer be the only big, bad dog putting up a fight….
clipped from seochatter.wordpress.com

I’ve noticed a severe and drastic change in the landscape of search from just a month ago. Even if you’ve really been watching, you’ll be surprised to know that, in the past 3 weeks:

Google data scrapers stopped working – SERP structure changes broke many SEO tools
Google Chrome – The omnibox; all indicators are that Google has not monopolized its browser
Yahoo! BOSS – A whole new ability to “foster innovation in the search landscape”
Yahoo! SiteExplorer – new features
Weather Report: Yahoo! Search Index Update
Yahoo! SearchMonkey – Yahoo!’s new developer platform that uses data web standards and structured data to enhance the usefulness of search results
Live Search Webmaster Tools – All new look at Live’s indexes
IE 8 – All indicators are that Micro$oft has not monopolized its browser

For that much activity across all search engines, within such a short period of time, one could only reason that competition is getting quite fierce!

How to Guarantee Noticeability

September 18, 2008
Seth’s newest post has encouraged me to beef up the About Chat Man section of SEO Chatter. The plain and simple fact of the matter is that I am doing it differently: I want potential SEO clients to know what to ask and what kinds of answers they should expect.

Many thanks, Seth, for reminding me to remember *why* I do the things I do and undertake the projects I start.

clipped from sethgodin.typepad.com

But you’re not saying anything

big companies didn’t want the logo to be part of their story, they just wanted it to fit in with all the other big company logos
same thing goes on with pricing. If you price your products like the competition does, you’re not saying anything with your pricing. “Move along, there’s nothing to see here.”
Marketing storytelling is not about doing everything differently. You do many things the same, intentionally, because those ‘same things’ aren’t part of your story. It’s the different stuff where you will be noticed, and the different stuff where you tell your story.
If you’re not telling a story with some aspect of your marketing choices, then make sure that aspect is exactly what people expect. To do otherwise is to create random noise, not to further your marketing.