Every Search Engine Robot Needs Validation


Your website is ready. Your content is in place, and your pages are optimized. What is the last thing you should do before uploading your hard work? Validate. It is surprising how many people do not validate the source code of their web pages before putting them online.

Search engine robots are automated programs that traverse the web, indexing page content and following links. Robots are basic, and they are definitely not smart. Robots have the functionality of early generation browsers: they don't understand frames; they can't handle client-side image maps; many types of dynamic pages are beyond them; they know nothing of JavaScript. Robots can't really interact with your pages: they can't click on buttons, and they can't enter passwords. In fact, they can only do the simplest of things on your website: look at text and follow links. Your human visitors need clear, easy-to-understand content and navigation on your pages; search engine robots need that same kind of clarity.
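To make that contrast concrete, here is a minimal sketch (the page name and link text are invented for illustration) of navigation a robot can follow versus navigation it cannot:

    <!-- A plain text link: a robot can read the anchor text and follow the URL -->
    <a href="/products.html">Our Products</a>

    <!-- A JavaScript-only button: robots don't run scripts, so this path is invisible to them -->
    <button onclick="window.location='/products.html'">Our Products</button>

A human visitor can use either; a robot can only use the first.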

Looking at what your visitors and the robots need, you can easily see how making your website "search engine friendly" also makes it visitor friendly.

For example, one project I worked on had many validation problems. Because of the huge number of errors in the source code, the search engine robots were unable to index the web page, and in particular could not reach a section of text containing keyword phrases chosen specifically for that page. Ironically, human visitors had problems with the page as well. Since humans are smart, they could work around the problem; the robots could not. Fixing the source code corrected the situation for both human and automated visitors.

There are several tools available to check your HTML code. One of the easiest to use is published by the W3C (http://validator.w3.org/). While you're there, you can also validate your CSS code at W3C's page for CSS (http://jigsaw.w3.org/css-validator/). The reports will tell you what source code needs to be fixed on your web page. A single extra or unclosed tag can cause problems. With valid code, both your human visitors and the search engine robots can travel through your website and read your pages without source code errors stopping them in their tracks. How many times have you visited a website, only to find something broken when going through the web pages? Too many to count, I'm sure. Validating your pages makes it that much easier for your website to get noticed.
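As an illustration of how little it takes, here is a small sketch (the heading and paragraph text are invented) of the kind of markup that trips up a robot, followed by the corrected version a validator would point you toward:

    <!-- Invalid: the <h1> is never closed, so everything after it may be parsed as part of the heading -->
    <h1>Welcome to Our Site
    <p>Keyword-rich text that a robot may never index correctly.</p>

    <!-- Valid: every tag is opened and closed properly -->
    <h1>Welcome to Our Site</h1>
    <p>Keyword-rich text that a robot can now read and index.</p>

Running the page through the W3C validator will flag the unclosed tag and tell you exactly where in your source code to fix it.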

As I said before, what works for your website visitors works for the search engine robots. Usability is the key for human visitors and automated robots alike. Why not provide the best chance for optimum viewing by both?

Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Marketing company serving small businesses. She has specialized in Search Engine Optimization since 1998, including three years as the Search Engine Specialist for O'Reilly Media, Inc., a technical book publishing company.

Copyright © 2002-2005 Search Innovation Marketing (http://www.searchinnovation.com) - All Rights Reserved.

Permission to reprint this article is granted if the article is reproduced in its entirety, without modification, including the bio information. Please include a hyperlink to http://www.searchinnovation.com when using this article in newsletters or online.
