How do search engines find web sites? Do they wait for you to submit your site? Not any more. These days, search engines index a site whether it is formally submitted or not. They use automated software robots, called crawlers or spiders, that move through the World Wide Web following hyperlinks from page to page and site to site, recording what they find as they go. They do this continually, checking back with sites already listed to see if they still exist, to find new pages, and to check for new content.
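The crawling idea above can be sketched in a few lines of code. This is a toy illustration only, not how any real search engine is built: the "web" here is a made-up in-memory table of pages and the links they contain, and the crawler simply follows every link it finds, recording each page once.

```python
from collections import deque

# A toy "web": each page maps to the pages it links to.
# All of these site names are invented for illustration.
WEB = {
    "example.com": ["example.com/about", "other-site.com"],
    "example.com/about": ["example.com"],
    "other-site.com": ["new-site.com"],
    "new-site.com": [],
}

def crawl(start_page):
    """Breadth-first crawl: follow every hyperlink, recording pages as we go."""
    seen = {start_page}
    queue = deque([start_page])
    while queue:
        page = queue.popleft()
        for link in WEB.get(page, []):
            if link not in seen:   # record each page only once
                seen.add(link)
                queue.append(link)
    return seen

indexed = crawl("example.com")
```

Notice that "new-site.com" is found even though nobody submitted it: it was reachable by a chain of links from an already-known page, which is exactly the point of the paragraph above.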
This means that if your web site is linked to from another web site that is already indexed, your site will be indexed on the robot's next pass! If all of your web site's pages are properly linked to each other, they will be indexed as well. In addition, many search engines actually share the same data, so if you get listed on one of them you will eventually be listed on the rest.
The only time you will need to submit your new site is if no other web site links to yours. At Back2Front, we take care of this for you; however, if you would like to submit your site yourself, you can do so here: www.google.ca/addurl/
If you are in a niche market that has its own search engines – genealogical web sites, for example – it is worth submitting your site to those specialist search engines, which may not share data with the larger, more general ones.
So what about the SEO companies that will submit your site to thousands of search engines every month for a year for $375? This service is not only unnecessary but may actually harm your search engine ranking. These companies use robots to submit your site to a list of search engines over and over. Most search engine software can detect excessive automated submissions and may penalize sites that are submitted repeatedly, ranking them lower or even blacklisting them.
Of course, getting indexed is only half the battle. You need to be listed, but you also need to be well ranked. Ranking is how a search engine assigns a web site priority over other sites with similar content. The exact ranking algorithm each search engine company uses is a closely guarded secret, but they do make public the criteria the calculations are based on and provide basic guidelines to web site designers for successful search engine optimization.
Guidelines for successful search engine optimization:

- The site must have content relevant to the key words used in the search. The closer the match to the exact key words, the better.
- The more often, the more emphasized, and the higher on the page the search term appears, the better.
- The more incoming links from other related web sites, the higher the ranking.
- The bigger the site (more content, more pages), the higher the ranking.
- The more current the site, and the more often it is updated, the higher the ranking.
- The older the site (the longest uninterrupted active time span), the better.

In the long run, it is much more effective to make your web site as relevant and useful as possible to the people who will search for your product or service than to try to trick the search engines into ranking your web site higher than it deserves. In fact, several practices commonly employed to fool search engines now have negative effects. Search engine companies actively research such techniques and have developed ways to counter them, in the interests of maintaining the accuracy and integrity of their search results.
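To make the first two guidelines concrete, here is a deliberately simplified scoring function. It is not any search engine's real formula (those are secret, as noted above); it just shows how "the more often" and "the higher on the page" a keyword appears could each contribute to a relevance score.

```python
def score_page(text, keyword):
    """Toy relevance score for one page against one search keyword.

    Combines two of the guideline ideas:
      - frequency:  the more often the keyword appears, the better
      - prominence: the earlier (higher on the page) it first appears, the better
    """
    words = text.lower().split()
    kw = keyword.lower()
    hits = [i for i, w in enumerate(words) if w == kw]
    if not hits:
        return 0.0  # no match at all: irrelevant page
    frequency = len(hits) / len(words)        # share of words that are the keyword
    prominence = 1.0 - hits[0] / len(words)   # first occurrence near the top scores higher
    return frequency + prominence

# A page that mentions the term often and early outranks one that buries it:
page_a = "widgets are great widgets"
page_b = "we sell many fine widgets"
```

Real ranking also weighs incoming links, site size, freshness, and age, as the list above says; a production algorithm blends dozens of such signals, which is exactly why guessing at it is less useful than simply writing relevant content.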
So there you go: there really are no tricks, and the rules are straightforward – no more mystery. If your site takes all of the above into account, it will be well optimized for search engines.
However, there are some roadblocks that can prevent search engines from crawling and indexing your site properly, and you need to make sure your web site is free of them. I will deal with these in the next instalment of Search Engines Demystified. Stay tuned.
by Candace Carter, Back2Front-The Web Site People, 2009