8 Tips to Power Packed PDFs!

July 31, 2009
Sharon Housley of feedforall.com breaks down the important factors of PDF visibility into eight simple ‘to-dos.’ Keep these in mind when you publish your PDF to the web.

From my own experience with good ol’ fashioned trial and error, I also learned the following:

— Keeping text, especially Title Text and directional/decorated text, in its own PDF field is important. If you do not ‘field’ these items, you may end up with a jumbled mess of characters instead of a solid, spiderable, keyphrased title or header.

— Gain more keyphrase weight by reflecting/repeating keyphrases found on pages that link to your PDF and on the pages that your PDF links to. (If only this worked for HTML pages, too… Darn!)

— I don’t recommend archiving PDFs. Establish a URL and leave it!

— If you go to the trouble of publishing a PDF, at least get some credit for it by using in-copy linking with clean links.

— PDF Document Properties are also categorized and indexed. Not taking advantage of them is like not using alt text on your images.

— Use ‘outline’ formatting where possible to auto-generate heading tag formatting.

— Don’t publish a photo without having something relevant to say about it. Use event names, dates, branding, etc. Avoid: “Joe at the bar.” Use: “ABC’s Summer Picnic 2009: Joe’s Drinks.”

— If you think logically about file naming, then picking off even minor visibility issues across ALL file names can only enhance spiderability.

If you have any tips of your own, leave a comment!

Chat Man

clipped from www.promotionworld.com

Search Engine Optimization for PDFs

by Sharon Housley
July 31, 2009

1. Text Based
The most important thing involves how the PDF file is created.
2. Optimize Text
Text contained in a PDF file is very similar to web copy.
3. Link Depth
Links to the PDF files that you wish to have indexed by the search engines should be on a web page that is frequently spidered by search engines.
4. Add Links
5. Document Properties
This is done in Adobe Acrobat. Details: File -> Document Properties -> Advanced -> Document Meta-data.
6. Bold Heading
Use H1 or H2 tags to make headings and sub-titles bold.
7. Caption Under Photos
Search engines will index the image captions just as they would the “alt text” on a web page image.
8. Name Of The PDF
Use related keywords in the actual file name for the PDF file.

Pages of Note

September 16, 2008
Terri Wells, over at SEOChat.com, posted a nice article regarding pages that you should have on your site. These pages, IMO, are some of those 1% SEO factors: not necessarily direct contributors to ‘good SEO,’ but good for the end-user and overall user experience. (Except the bit about custom 404s I mention later in this post.) And, given the landscape, end-user experience inevitably carries weight at some point.

I like the recommendation of the ‘about us’ page being developed as a kind of ‘home page’ to the area of a company’s site that details operations and business movements. This seems to me to be a great candidate for sub-domaining.

Custom 404 pages, I think, might be weighted fairly heavily in the not-too-distant future, if the toolkit released by MSN in June 2008 or Google’s 404 widget release in late August 2008 are any kind of indication. When MSN and Google move toward the same goal, one should take proper note! Not only does a custom/dynamic 404 page clearly enhance overall user experience, but when a crawler hits ‘something crazy’ while spidering ‘something relevant,’ you now have the ability, on a custom 404 page, to point the bot back to a 200 URL with relevant info. Happy bots = happy SEs.
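For what it’s worth, wiring up a custom 404 page is usually a one-liner at the server level. A minimal sketch for Apache, assuming a hypothetical /custom-404.html page on your site:

```apache
# Hypothetical Apache config: serve a helpful custom page on 404s.
# /custom-404.html should link back to live (200) pages with relevant info.
ErrorDocument 404 /custom-404.html
```

Note that the custom page should still return a 404 status code (Apache does this by default when you point ErrorDocument at a local path), so bots don’t index the error page itself.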

Also, a very nice notable by Terri: “PRIVACY POLICY.” Get one and post it. If you don’t plan on collecting info, declare it! If you do collect info, make a promise to your clients that you will respect their privacy. This helps to solidify an effective CRM program, too.

Robots.txt: Have one, even if it’s blank!
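A “blank” robots.txt that blocks nothing looks like this — sketched here for illustration, with a made-up Sitemap URL:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

Even this empty version saves your error logs from a steady stream of 404s on /robots.txt requests.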

Sitemap: Be sure to have 2 versions: One for bots, and one for humans.
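The bot version is typically an XML sitemap per the sitemaps.org protocol; a minimal sketch with a made-up URL (the human version is just a plain HTML page of links):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-09-16</lastmod>
  </url>
</urlset>
```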

Good, general tips, but I think there’s reason to institute custom 404 pages.

Chat Man

clipped from www.seochat.com
Web Pages to Include in Your Site

By: Terri Wells
2008-09-15
Do you have an “About us” page?
Now let’s go to your “Contact us” page.
you should have a special page that explains your site’s copyright rules.
Some content creators are beginning to embrace the idea of Creative Commons, reserving some but not all of their rights under copyright.
By the way, it makes sense to include trademark information on this page as well
you should have a custom 404 page.
The problem with a generic page is that it does nothing to help users find what they want.
Do you plan to collect any information at all from users of your web site? If you do, you need to tell them what you plan to do with that information. This means you need to build a page with your privacy policy.
Speaking of collecting sensitive information from your users, if you do that at all, you’re also going to need a secure page, and you’re going to need to document your security procedures.
how to use robots.txt files;
official Sitemap page

Pagination: Problem Mostly Solved

August 29, 2008

Rand Fishkin, in a quite thorough video, explains that natural pagination (aside from duplicate content issues) reduces link juice as pages move from ‘readily prominent’ to ‘archived.’

In order to maintain effective link juice, he recommends implementing sub-categories, which also addresses dup content issues to an extent. The idea here is that individual articles/posts can be aggregated to one relevant, topical page based on a specific sub-category. (Prior to seeing this video, I’d attempted a similar technique by using tags, as you can see on the right side-bar.) Not only does this present a natural hierarchy to spiders, navigation by human users becomes much more streamlined as users will be able to link or bookmark the specific sub-category that interests them most!

Check out Rand’s video: SEOmoz Whiteboard Friday – Solving Pagination Problems


My Competitors Have All the Good Ideas

August 27, 2008
Some of us, myself included, stroll on over to our competition to check things out. But did you stop to realize that your competition is probably your greatest resource in striving for higher SERP rankings?

Lisa Barone writes a great article titled “Using Competitive Research to Find Content Ideas.”

Atta girl, Lisa!

clipped from www.bruceclay.com
One of the best ways to fill content holes on your Web site or blog is to do some competitive research to see what others in your industry are writing about. You want to see what they’re doing right, where they’re missing the mark, and what you could add to your site that they haven’t even thought of yet.
Doing competitive research can also be a good way to think up new tools, tricks or toys to add to your Web site to attract links. Often you’ll find that your competitors are writing confusing How-To articles that would make a much better video or have an article explaining the latest baby names that could easily be turned into a fun tool – take the initiative and make it. Users love interacting with fun content. You want to be continuously looking for creative ways to make yourself more interesting and more useful to your visitors.
So much of successful search engine optimization is about your ability to produce useful content that users will be interested in.

Content is Still King

August 26, 2008
Mark’s advice is spot-on! Theme your content! Without a theme, visitors will be sporadic, at best. Consistency across your site helps visitors and search engines to understand your overall topic. 

Content can come from a variety of sources. Industry advice, audience tips, and current news are all great places to get inspiration for relevant, fresh content.

clipped from searchenginewatch.com

Mark Jackson

 

SEO Success — Guess What…Content Works!

Theming Content

  • Offer advice on your industry.
  • Offer tips to your audience.
  • Deliver industry news.
No successful SEO effort takes place without a foundation of quality content. So if you have a small 10-page Web site, you’ll likely need to create a content strategy before you realize the benefits of SEO.

Multiple Sitemaps?

August 26, 2008
clipped from www.seochat.com

You Need More Than One Site Map

Table of Contents:
  • You Need More Than One Site Map
  • HTML Site Map to Help Human Visitors
  • ROR Site Maps for All Search Engines
  • Google and XML
Justin Pinkus at SEOChat.com posts his ideas on sitemaps and how each ‘audience’ of your site should have a sitemap specifically tailored toward that audience, including the search engines.

A few times in the post, Justin mentions broken links. You should have a link checker of some kind (I use GSite Crawler) and verify that your site’s links are functioning. Bad URLs can lead to bad rankings nearly immediately. I had an experience once where a directory page with relative URLs kept looping back to itself for about 30 of the on-page links. Yahoo! all but disappeared from my site until I got those links fixed.
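If you want to spot that kind of self-looping relative URL yourself, here’s a rough Python sketch (my own illustration, not GSite Crawler or any tool from the article):

```python
# Rough sketch of one link-checker job: find on-page links that resolve
# right back to the page itself (the self-looping relative URLs described
# above). A full checker would also fetch each URL and verify a 200 status.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_self_links(page_url, html):
    """Return resolved hrefs that point back to page_url itself."""
    parser = LinkExtractor()
    parser.feed(html)
    return [u for u in (urljoin(page_url, h) for h in parser.links)
            if u == page_url]
```

Run it over each directory page and anything it flags is a link that just loops the spider in place.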

Justin also refers to another article of his that seems promising, given the info he extracts for this article, but gives no link or location as to where this ‘other’ article can be found. Awww, pooh!


The Semantics of Understanding Semantics

August 22, 2008
Chris Boggs shared his insight into one of the ‘hot topics at SES San Jose’ regarding word/phrase co-occurrence. He goes on to source SEW Forum moderator Dr. Edel Garcia, and Marcia Welter (Marcia’s post on Webmaster World Forums is quite lively and insightful). Also addressed are spammers and how successful email spam with auto-generated content slips past filters, possibly giving spammers a glimpse into how to get past search filters. Then, he poses the ‘if all things are equal’ question regarding co-occurrence.

Commentary note: But, if Google is only aware of about 30% of the web, are search crawlers still effective? Or will Web 2.0 essentially become ‘self-indexing,’ policed by a variety of humans, 24/7, 365 days a year? Shoot me your thoughts!

clipped from searchenginewatch.com

Word or Phrase Co-Occurrence within Particular Industries
If two documents have the same number of links pointed to them, and have relatively equal content and trust value, could the addition of particular words that are more likely to occur in topical content be the difference-maker that drives a higher ranking?
I’ve had conversations recently with Mike Grehan about the Web and how the crawler is no longer effective. He points out that Google only sees about 30 percent of all new content created each day.

Combine that with the comment Pat Sexton made this week that well over 10 million people have found Facebook’s SuperPoke! app without using Google. And you can start to see a way of communicating information that doesn’t rely on a search engine.