October 7, 2006 CSS (Cascading Style Sheets) is a great way to design and lay out a site with SEO in mind. We have discussed how the crawlers read through the HTML of your site and how this helps determine how you will be ranked. It’s best to make a website easy for these crawlers to read so that the pertinent information reaches them promptly. Though the algorithm that determines your ranking is very complex, the crawler that crawls your site is fairly simple. Search engine crawlers are much like Internet browsers, with a few differences. First, the browser used to crawl is not updated as often as personal browsers like IE or Firefox; updating a search engine’s crawler is a tedious and daunting task, and so much depends on it that these updates roll out far less frequently. Second, search engine crawlers can only see text. They cannot see images, so they rely heavily on the alt text on your images.
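Since the crawler sees only text, descriptive alt text is the one piece of an image it can actually index. A minimal sketch (the filename and description here are made up):

```html
<!-- The crawler cannot see the image itself; the alt text is
     the only content it can read for this element. -->
<img src="blue-widget.jpg" alt="Blue widget with a chrome handle">
```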
That being said, you want to create an easily crawled environment with as few unneeded tags and as little extra code as possible. CSS can help you do this. You see, CSS lets you define standards for many of the design and layout features used on your site: font color, background color, placement of content blocks, and so on. Without it, all of these things are defined during the actual implementation of each feature in the HTML, which can mean a great deal of extra code, especially on larger and more detailed pages. In CSS, they can all be defined one time and used throughout the site by calling those definitions. This, obviously, alleviates a great deal of code on the site, allowing for faster loading times and ease of a crawler’s…umm…crawl. Check out the CSS tutorial here at Tizag.
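To illustrate the savings, here is a quick sketch of the same styling done both ways (the color and font values are just examples, not anything prescribed):

```html
<!-- Without CSS: the same presentation repeated on every element -->
<p style="color:#336699; font-family:Arial;">First paragraph</p>
<p style="color:#336699; font-family:Arial;">Second paragraph</p>

<!-- With CSS: defined once, then every <p> picks it up for free -->
<style type="text/css">
  p { color: #336699; font-family: Arial, sans-serif; }
</style>
<p>First paragraph</p>
<p>Second paragraph</p>
```

Multiply that difference across hundreds of paragraphs and pages and you can see how much leaner the crawled HTML becomes.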
Now, CSS can be implemented in three ways, two of them internal to the page. The first internal way is to make calls to a style inline, while writing the code. This happens throughout the coding process, and it takes the amount of code down a bit, but not as much as the other two ways. The second internal way is to put the definitions in the HEAD, and have the rest of the page call to that area. This alleviates a great deal of code, but poses another issue: the crawlers read like you and me, left to right, top to bottom. Putting a lot of information in the HEAD means that much more code the crawler has to get through before it reaches your content. Again, this isn’t entirely bad, but it can be taken just one step further. The third way, and in my opinion the best, is to put the CSS in a separate file altogether and call to that file for the definitions. This removes far more code from every page and improves load time a great deal. Again, more can be read on this at Tizag.
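The three approaches above can be sketched like this (the styles.css filename is my own placeholder):

```html
<!-- 1. Inline: the style call is written into the tag itself -->
<p style="color:#336699;">Some text</p>

<!-- 2. Internal: definitions live in the HEAD of each page -->
<head>
  <style type="text/css">
    p { color: #336699; }
  </style>
</head>

<!-- 3. External (my preference): one small call to a separate file -->
<head>
  <link rel="stylesheet" type="text/css" href="styles.css">
</head>
```

With the external file, every page on the site carries only that one LINK line, and the browser can cache the stylesheet instead of downloading the definitions again on each page.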
Now I want to touch on a subject that is not stressed enough. SEO procedures known as “black hat” are any methods used to trick or otherwise exploit a search engine to get better rankings and traffic. These methods rarely work, and when they do, not for long. Once one of these methods is caught by the search engines, you are left with very few options to climb out of the grave you just dug. I think I will post on this next, but for now just be aware that this is a bad idea. That being said, using CSS you could theoretically place text on a page and hide it completely from the viewer while still showing it to the search engines. It is widely believed that the search engines cannot read CSS, but I don’t believe this to be entirely true. There is some debate on this, and good evidence on either side, but if they can’t read it now, they will one day soon. And that will be a bad day for a website using this tactic.
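For the record, the hidden-text trick described above typically looks something like this (the class name is hypothetical); it is shown only so you can recognize and avoid it:

```css
/* Black hat hidden text -- do NOT do this.
   White text on a white background is invisible to visitors
   but still sits in the HTML for a crawler to read. */
.hidden-keywords {
  color: #ffffff;
  background-color: #ffffff;
}
```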
I hope this helps in understanding how CSS can help your SEO efforts. As always, if there are any questions at all, please feel free to find me or e-mail me. I’m always in the mood to talk SEO.