Erin Kilgour, Staff Writer, 9-14-2009
Submitting your web page to a search engine
It is important for every webmaster to submit their web pages to search engines so that they are listed in search results. The rules and paths followed by the major search engines are broadly similar, so following the guidelines Google provides for webmasters will help you submit your site to almost any engine. This will help search engines find, index and rank your site favorably. Following the guidelines will also help you avoid practices that most search engines, especially Google, consider illicit and that may lead to penalization or removal of the site; penalized sites no longer appear in search results. The guidelines fall broadly into three groups: design and content guidelines, technical guidelines and quality guidelines.

Once your site is ready as per the guidelines, submit your URL to Google through http://www.Google.com/addURL.html. When you submit your URL you should also submit a sitemap; the Google webmaster tools guide will be quite helpful in this regard. Google uses the sitemap to learn about your website's structure and to increase the coverage of your site's individual web pages. If other sites link to yours, make sure their owners are aware that your website has been commissioned and is online after you submit your URL to Google. Otherwise, if the links are not working properly, Google may treat them as dead links during an update and remove them from the index.
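As a rough illustration of what a sitemap submission contains, a minimal sitemap is a plain XML file listing each URL you want the search engine to know about (the example.com addresses and dates below are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-09-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/articles/index.html</loc>
    <lastmod>2009-08-15</lastmod>
  </url>
</urlset>
```

Each `<url>` entry tells the crawler about one page, and the optional `<lastmod>` date hints at when it last changed, which helps the engine decide when to revisit it.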
Before you submit your web page to a search engine, there are certain design and content guidelines to follow for a better acceptance level. The hierarchy and text links in your site should be clear, and each page should be accessible from at least one static text link. Provide a sitemap with links so that your users can see the important parts of your site; if your site is large and the sitemap contains more than 100 links, it is wise to break it up into separate pages. Your site should be useful and rich in information, with a good number of pages that describe your content clearly and accurately. You will also need to keep it keyword rich: think of all the words your users might type to find sites of your kind, and include those words in your content-rich articles. As far as possible use text in place of images or animations, especially for the most important names, links and content. The crawlers of most engines do not recognize text inside images, and if that text carries your keywords you can miss out on vital search results. If the use of images cannot be avoided, use the ALT attribute to describe the content in brief text. When you submit your URL to Google, provide accurate, descriptive title elements along with ALT attributes, and check and correct broken links and HTML tags. Sites rich in dynamic pages may not be treated favorably by search engine crawlers, so keep URL parameters short and the number of such dynamic pages low; static pages have a better chance of being seen by crawlers. Links on any given page should be kept to a reasonable number, preferably fewer than a hundred. There are separate guidelines for images that should be followed when publishing them.
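To make the title-element and ALT-attribute points concrete, here is a small hand-written HTML sketch (the page topic, file names and wording are invented for illustration):

```html
<head>
  <!-- A descriptive, accurate title element for this page -->
  <title>Handmade Oak Furniture - Tables, Chairs and Bookcases</title>
</head>
<body>
  <!-- Crawlers cannot read text inside an image, so describe it in ALT -->
  <img src="oak-dining-table.jpg" alt="Six-seat oak dining table with turned legs">
  <!-- Prefer a plain static text link over an image link for important navigation -->
  <a href="catalogue.html">Browse the full furniture catalogue</a>
</body>
```

The ALT text and the text link give the crawler readable, keyword-bearing content that an image alone would hide from it.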
Besides submitting your URL to search engines, you can also submit it to web directories. When you submit to a web directory you gain the advantage of reaching niche-market browsers. Since web directories emphasize categories rather than keywords, you can keep the content on your site rich and informative and submit to a directory under at least three categories. A web directory does not display individual web pages but rather the link to your site directly under a category.
So, it would be wise to take note of some technical guidelines when planning your website.
- Examine your site with text browsers like Lynx.
- Ensure your web server supports the If-Modified-Since HTTP header. This allows the server to tell a search engine whether your content has changed since it was last crawled and indexed, which saves you bandwidth and overhead expense.
- Most web servers honor a robots.txt file; make use of it, as it tells search engine crawlers which directories to crawl and which to avoid. You can also use robots.txt to exclude auto-generated pages that hold little value for your users and keep them out of the search results pages.
- If you use a content management system, ensure that it produces pages and links that search engines can easily crawl when you submit your web URL.
- Finally, make sure your site loads and appears correctly in different browsers. Users use a variety of browsers, and if your site does not work in one of them, you may lose a potential customer.
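As one way to picture the If-Modified-Since exchange, here is a small Python sketch of the server-side decision; the function name and the sample dates are invented for illustration, not taken from any particular web server:

```python
# Sketch: deciding whether a 304 Not Modified (empty body) can be sent
# instead of re-serving the full page to a crawler.
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone
from typing import Optional

def should_send_304(if_modified_since: Optional[str], last_modified: datetime) -> bool:
    """Return True if the crawler's cached copy is still current, so the
    server can answer 304 Not Modified and save bandwidth."""
    if if_modified_since is None:
        return False
    try:
        cached = parsedate_to_datetime(if_modified_since)
    except (TypeError, ValueError):
        # Malformed header date: fall back to sending the full page.
        return False
    # HTTP dates have whole-second resolution, so drop microseconds.
    return last_modified.replace(microsecond=0) <= cached

# Example: page last changed 1 June 2009; crawler's copy dates from 1 July 2009.
page_changed = datetime(2009, 6, 1, tzinfo=timezone.utc)
crawler_header = format_datetime(datetime(2009, 7, 1, tzinfo=timezone.utc), usegmt=True)
print(should_send_304(crawler_header, page_changed))  # → True: send 304, skip the body
```

When the function returns True the server sends only headers, which is exactly the bandwidth saving the guideline describes.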
The third group of guidelines to follow when you submit your web page to a search engine are the quality guidelines. These cover the most common kinds of deceptive and manipulative behavior, to which search engines respond negatively. Beyond following what is specified on Google's guidelines page, it would be unwise to assume that a deceptive technique is acceptable, or approved of by search engines, simply because it is not mentioned there. Search engines and webmasters share a single aim: to give the user a better browsing experience and better sales conversion rates, and websites should be designed and launched with that aim in mind.