To help Google crawl your site efficiently and effectively, remember these six rules:
1. Avoid Content Duplication
Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has articles in “regular” and “printer” versions and neither set is blocked in robots.txt or via a noindex meta tag, Google will choose one version to list. If Google perceives that duplicate content may be shown with intent to manipulate rankings and deceive users, it may also make appropriate adjustments in the indexing and ranking of the sites involved.
Be sure to use Google Webmaster Tools to see what Google sees when crawling your site and to determine which duplicate-content issues can be easily fixed.
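As a sketch, one common way to keep the printer-friendly version of a page out of the index is a noindex meta tag in that page's head (the comment describes intent; the page it goes on depends on your own site's structure):

```html
<!-- Place on the printer-friendly duplicate, not the main article -->
<meta name="robots" content="noindex" />
```

With this in place, Google can still crawl the printer version but will keep only the regular version in its index.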
2. Remove User-Specific Details From URLs
If you have URL parameters that don’t change the content of the page, such as session IDs or sort order, remove them from the URL. You can use a cookie to hold this information instead.
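As an illustration (the domain and parameter names here are made up), the goal is to collapse many URLs for the same content down to one:

```
Before: http://www.example.com/products.asp?sessionid=A8F3K2&sort=price
After:  http://www.example.com/products.asp
```

The session ID and sort preference would then travel in a cookie rather than the URL, so every visitor, including Googlebot, sees the same address for the same page.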
3. Disallow Actions Googlebot Can’t Perform
You can disallow crawling of shopping carts, landing pages, contact forms, and other pages containing calls to action that a crawler can’t perform.
Learn how access files such as robots.txt work, or make sure whoever is helping you with your site has this taken care of.
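A minimal robots.txt sketch, assuming your cart and contact pages live at paths like /cart/ and /contact.asp (adjust these to your site's actual structure):

```
User-agent: *
Disallow: /cart/
Disallow: /contact.asp
```

This file sits at the root of your domain and tells all crawlers to skip those paths.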
4. Optimize Dynamic URLs
You can recognize dynamic URLs by the question mark they contain.
Search engines have problems creating links to dynamic content. Where appropriate, use static URLs to reference dynamic content. Otherwise, try to ensure your dynamic URLs are linked to from content referenced by static URLs. There are also paid-inclusion programs that you can use to assist with this.
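If your site runs on Apache, one common way to present a static-looking URL for dynamic content is a rewrite rule; a sketch, where listing.asp and the /listing/123 pattern are hypothetical examples, not part of any real site:

```
# .htaccess (Apache mod_rewrite): serve /listing/123 from listing.asp?id=123
RewriteEngine On
RewriteRule ^listing/([0-9]+)$ /listing.asp?id=$1 [L]
```

Visitors and search engines see the clean /listing/123 address while the server still runs the dynamic page behind the scenes.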
5. Rein In Infinite Spaces
Do you have a calendar that links to an infinite number of past or future dates? Our own CMS sites, for example, can include a calendar of events built on Google's own Calendar, which we wrap into the site. A calendar like this is an infinite crawl space on your website, and crawlers could be wasting their bandwidth, and yours, trying to crawl it all.
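One simple fix is to keep crawlers out of the calendar pages entirely; a robots.txt sketch, assuming the calendar lives under a hypothetical /calendar/ path:

```
User-agent: *
Disallow: /calendar/
```

Visitors can still browse every date, but crawlers stop chasing next-month links forever.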
6. How To Get Your Preferred URLs Indexed
* Set your preferred domain in Google’s Webmaster Tools (www.deluxeagent.com vs. deluxeagent.com)
* Put canonical URLs in your Sitemap
* Use the new rel="canonical" on any duplicate URLs
Example: <link rel="canonical" href="http://www.deluxeagent.com/portfolio.asp" />
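For the Sitemap, list only the canonical form of each URL; a minimal sketch using the same portfolio page (the other entries on your site would follow the same pattern):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.deluxeagent.com/portfolio.asp</loc>
  </url>
</urlset>
```

Keeping the Sitemap, the preferred-domain setting, and the rel="canonical" tags all pointing at the same URL form gives Google a consistent signal.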
If you are not sure how to do any of this, please contact a professional web team like ours to help you with it.
If there is one thing I cannot stand, it is seeing agents spend so much money on websites that fail to help their business because they were not developed well for today's web and e-marketing needs.