Some Guidelines on Sitemaps and Site Feeds for Search Engines

It is vital for website owners to incorporate basic SEO techniques into their websites, as this enhances search engine ranking. In this connection, it is important to understand the significance of sitemaps. An HTML sitemap is a page of the website on which links to the site's other pages are collected, and it supports the objective of getting more benefit from search engine optimization.
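As an illustration, an HTML sitemap can be as simple as a page of plain links (the page and file names below are hypothetical):

```html
<!-- sitemap.html: a simple HTML sitemap page; file names are examples only -->
<h1>Site Map</h1>
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```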


Webmasters are often plagued by the challenge of building a website that is search engine friendly. Despite their best efforts, their web pages do not appear on the search engine results page even when they search directly for terms that should have yielded the website. This indicates that the website is not search engine friendly. To create sitemaps and site feeds that are compatible with search engines, certain guidelines should be taken into account:

01)      Adding text alternatives to images, Flash and videos is imperative, as search engine software can primarily “read” text. This does not mean that text embedded in an image, Flash file or video is always ignored; rather, ordinary text content is favored. Google can extract some text from Flash files, but search engines generally cannot view an image file or video to determine the text it contains. When images are placed in a web page using HTML, each picture should be described in the “alt” attribute of the img tag.
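For example, the alt attribute on an img tag might look like this (the file name and description are illustrative):

```html
<!-- The alt text describes the image for search engines and for visitors who cannot see it -->
<img src="red-widget.jpg" alt="Red widget with chrome handle, front view">
```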

02)      Validating the HTML code ensures that it contains no errors. Valid markup allows web browsers to format the page as the author intended, while letting search engines determine which portions of the page to index.
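A page that passes validation starts from a well-formed skeleton such as the following minimal sketch (a validator such as the W3C Markup Validation Service can then check a full page):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Example Page</title>
  </head>
  <body>
    <h1>Example Page</h1>
    <p>Well-formed, validated markup helps both browsers and search engines.</p>
  </body>
</html>
```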

03)      Creating relevant title tags matters because search engines give more weight to text appearing in the HTML title tag of a page. Search engines use the tag as part of their algorithms to determine the content of the page.
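A descriptive title tag sits in the head of the page (the wording below is illustrative):

```html
<head>
  <!-- A specific, descriptive title rather than a generic one like "Home" -->
  <title>Handmade Red Widgets | Example Widget Shop</title>
</head>
```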

04)      Using plain HTML navigation links on the website allows search engines to locate its pages. Links generated by JavaScript may not be indexed by search engines, and links embedded in Flash files likewise cannot be “read”. Websites that rely entirely on such links are at a disadvantage compared to websites that use HTML links. Adding a sitemap to the website, with a link to it from the main page, helps both search engines and human visitors navigate the site.
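The difference can be seen side by side (the URL is a placeholder):

```html
<!-- Crawlable: a plain HTML anchor with a real href -->
<a href="/products.html">Products</a>

<!-- Risky: the destination only exists in JavaScript, so crawlers may never follow it -->
<span onclick="window.location='/products.html'">Products</span>
```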

05)      Eliminating apparent content duplication, where identical pages are reachable under different URLs or on different websites. Such duplication dilutes the link value a page accumulates, so the website owner should ensure each page is reachable under a single URL. Blocking the duplicate URLs with a robots.txt file may be the apt solution.
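As a sketch, a robots.txt file placed at the root of the domain can ask crawlers to skip a duplicate copy of the content (the directory name is hypothetical):

```text
# robots.txt — placed at the root of the domain
User-agent: *
Disallow: /print-version/
```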

06)      Removing hidden text, as it is counterproductive. Hidden text is text in the main body of a web page that is not displayed on the screen when visitors view the page. Using a free web host may cause a website to contain hidden text inserted by the host. The best solution is to obtain a domain name and place the site with a commercial web host.
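Hidden text typically takes forms like the following, which search engines may treat as spam (shown only as a pattern to avoid):

```html
<!-- Patterns to avoid: text styled so visitors never see it -->
<p style="display: none;">widgets cheap widgets buy widgets best widgets</p>
<p style="color: #ffffff; background-color: #ffffff;">invisible keyword stuffing</p>
```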
