My CMS

Tag: search engines

  • 301 Redirect Those Old URLs

    22 May 2007

    Today I consulted on an issue that comes up all the time, but I don’t think I have ever blogged about it: 301 redirects, more specifically in reference to moving a site to a new domain name.

    For whatever reason, you want to change the domain name of the site, but need to keep the link juice, traffic from bookmarks, and the rankings gained by the old domain name. The way to do this is with a 301 redirect, a method set up in the .htaccess file in the root directory of your website (on Apache servers). A 301 redirect tells a visiting browser (and therefore search engines) that “the site has moved forevers and evers, and it is now here.”
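
    Under the hood, a 301 is just an HTTP status code. A server answering with one sends a response roughly like this (a simplified sketch; the Location value is a placeholder):

    HTTP/1.1 301 Moved Permanently
    Location: http://www.newsite.com/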

    In reference to the ranking and link juice, the amount of time for recuperation can vary. I’ve experienced as little as a week, and as long as 2 months. Mind your SE’s and O’s and you should be fine.

    This is the only acceptable way of redirecting in the eyes of a search engine. I say “acceptable” because, skilled black hatters aside, the search engines will not play well with other methods, and will likely penalize the site or drop it altogether.

    So, how do we implement a 301 redirect correctly? Like so:

    • Navigate to your website’s root directory (this is normally where your index file is)
    • Look for a file called .htaccess (notice that there is no extension)
    • If there isn’t one, open Notepad and make one (again, be mindful that there is no extension)
    • In the .htaccess file, write this:

    Redirect 301 / http://www.newsite.com/

    Notice we are saying: Redirect, of a 301 type, everything from the root path / of the old site to http://www.newsite.com/. Pretty simple, huh? Make sure of a few things. First, that you are not including the whole URL in the first part (it is a path, not a domain), but are in the second; also, that you are including a single space between each part.
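
    If your host has mod_rewrite enabled, there is also a variant that preserves the full path of every old URL, so old.com/page.html lands on www.newsite.com/page.html instead of the homepage. A sketch, with old.com and www.newsite.com as placeholder domains:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?old\.com$ [NC]
    RewriteRule ^(.*)$ http://www.newsite.com/$1 [R=301,L]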

    Well, because that was so simply and quickly explained, I would like to explore a few reasons why we would want to do this.

    One of the most common reasons these days (the days of social and viral marketing) is that you made some marketing effort that will bring much to a domain in the way of hits, diggs, and then links. Eventually that marketing effort will pass, and the need for that page will likely fade. So why not take advantage of all those links you got, and send them to another of your pages?

    Maybe you decided that the domain name you had before didn’t mesh well with your company name, or the domain name you have been waiting on has finally become available. Pick up the new domain name, and 301 redirect it to the old (or vice versa if you want to get the new one ranked).

    I also want to discuss a few of the methods that are not accepted by the search engines.

    You can make use of a “meta redirect” by placing the necessary code in the head of your site. Doing so allows you to set the amount of time before the browser redirects to the new page. This provides the browser with a 200 OK status on both of the pages, which causes a problem because the search engine will want to index both of them. You could (if you were so inclined) set up a whole bunch of text and meta for whatever terms you wanted for the benefit of a search engine, and direct the “real” visitor to the page you want. As such, this method will get both pages (or both sites) penalized in most cases.
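
    For reference, the meta redirect in question looks something like this (a sketch, shown so you know what to avoid; the 5 is the delay in seconds before the browser moves on):

    <meta http-equiv="refresh" content="5; url=http://www.newsite.com/">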

    You can do the “sneaky JavaScript” thing, but as you can see from that link (it’s an old method, obviously) it’s a bad idea. There are a few updated methods, but I’m not too well versed in those areas. You think SEO is tricky? Try being a skilled black hat; now that’s hard work.
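
    For the curious, the basic JavaScript version looks like the snippet below. This is shown only to illustrate what the engines treat as suspect, not as a recommendation:

    <script type="text/javascript">
    // Redirects the browser without giving search engines an honest 301
    window.location.replace("http://www.newsite.com/");
    </script>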

    I hope this helps with any questions regarding 301 redirects. If you have any more, please feel free to comment or let me know.

  • SEO Factor Blog

    November 16, 2006

    Yahoo!, Google, and MSN have agreed on the Sitemap protocol previously supported by Google alone. We all know what a sitemap is, and how it helps your site get indexed. If not, read here. The protocol is basically an XML file uploaded to your root directory, mapping your site and giving a little information on how often it changes, when it was last updated, etc. Though it will not improve rankings in the search engines, it will help with the search engine’s attempt at crawling your site, thus increasing its index potential.

    From sitemaps.org, the site released by all of the search engines:

    “Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.

    Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.”
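
    In practice, the simplest Sitemap is a short XML file like this (a minimal sketch; the URL and values are placeholders, and only the loc element is required for each entry):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2006-11-16</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>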

    This marks the second time that the three big guys have teamed up for the common, greater benefit of the online community. The first was the recognition of the ‘nofollow’ tag.
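
    As a refresher, ‘nofollow’ is just an attribute on a link, telling the engines not to pass credit through it:

    <a href="http://www.example.com/" rel="nofollow">some link</a>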

  • Untitled

    We will write an article about your site, and release it through a large number of distributors, who will in turn release it further to the online community. These articles will be optimized to match the subject and desired terms on your site, and will also provide links to your site for link popularity. Some website designers like to have content on their sites, so they use the content from press releases. This lets them get quality content on a subject, while the links to your site and your business information are still included, giving you the credit and traffic. There are also a great number of web surfers who sift through press releases looking for news or new services/products. This allows your site to benefit from the traffic of people reading your press release.

    Our Press Release Service: $300

    • An article written about your site, up to 700 words
    • Optimization of the text of the article to include and target the desired search terms
    • Information on your business, and links leading to your site
    • Submission of your article to 25 major article distributors

    Newly created and optimized sites need to be submitted to the search engines in order to increase their visibility. Directories are a great way to get more traffic and link popularity for your site, thus helping to increase its ranking in search engines like Google.

    Submit Lite: $50

    • Submissions to major search engines including Google, Yahoo!, MSN, Ask.com, and many more.
    • Submissions to 50-60 quality general directories
    • Submissions to at least 5 industry specific directories

    Submit Pro: $75

    • Submissions to major search engines including Google, Yahoo!, MSN, Ask.com, and many more.
    • Submissions to 75-100 quality general directories
    • Submissions to at least 5 industry specific directories
  • SEO Factor Blog

    October 9, 2006

    Google recently filed for a patent that would allow for algorithmic ranking along with the efforts of human editing. This would mean that the bulk of the work in determining ranking and results would rest with a computer, but would be helped by the integration of editorial opinion. You can read the patent details here.

    So, what does this mean? It means there is a chance that a human will make a piece of the decision on how relevant or beneficial a site is in reference to its visibility online. So it would be a good idea to make sure that your sites are indeed built for the human. This shouldn’t be anything new, but it will help to reinforce what is considered acceptable SEO practice. Sites shouldn’t really be built for the search engines, but for the end user/visitor/buyer. Even as a consultant, I often tell my clients that we will not be optimizing a site for the search engines as my title implies, but optimizing a site to show the search engines what the site is all about. When you concentrate on the search engines, you neglect what is really important, and what the site was created for in the first place: the customer.

    This patent will force that along, because there will be a person looking at your site and giving an opinion on how important the site is to the online community, how relevant it is to its desired search terms, and whether or not it will be considered an authoritative source of information or services. So if you’re creating a site for the algorithmic machine, then you’re probably going to be disappointed when someone at Google looks at your site with disdain and sends it to the bottom of the list.

    However, just because a search engine files for a patent doesn’t mean it will implement any of the changes. These are public records, and could very well be an ‘Art of War’ kind of situation. Smoke and mirrors. Illusions of the mind. Either way, the same rule holds true: make a nice site for your customers, not some stupid robot.

  • SEO Factor Blog

    October 6, 2006

    We have discussed meta tags and such. Now we will discuss the content that is on your homepage. The content is in part how the search engines know what your site is all about. They can’t see pictures, so naturally they will have to read your site. With that said, I do want to point out that a picture of words will not count. This is often done in Photoshop, Flash, or some other imaging software to create effects on a website. So be sure that the text on the site is “real” HTML text.

    There are a lot of debates about the number of words that should appear on the homepage. 250 words is thrown around a lot. Try to think of it like this: if you are an expert in an area, you’re probably gonna have a lot to say about a subject. So, if you have a lot to say about a subject, the search engines will feel that you are an expert in that area, and likely want to show you first. Google themselves said 500 words one time, but again, don’t do things for the search engines, do it for your customer(s). And you obviously want to educate your customers, right? We like to throw out the 250-word number because it’s not a whole lot of work to get that much, and it gives ample opportunity to use the terms that you want to be found under.

    Now we come to density. You are proving to a search engine that you are relevant to a certain term. What better way than to use that term? Let’s stop for just a second. I’m gonna repeat something again. If I could, I would put a disclaimer between every single word I write so that it gets hammered into your brain: you don’t want to do things for the search engines. You want to prove to them that you are relevant, not trick them into thinking so. There is an abuse of density known as “stuffing.” Basically this means that you are placing your terms all over the place to show your relevancy. Don’t get caught up in that mindset.

    Let us continue. This is why it’s a good idea to still have a keywords tag. You want to be mindful of the terms that you want to be found under, so try to use these terms a minimum of one time each. Your title is the most important, so it should contain at least five of your most important search terms. Yahoo! especially puts priority on the title tag. Don’t use more than 80 characters; some reports say 70. You will want a good percentage of density of your title words in your text. Try for around 5%, but don’t exceed 10%-12%. Too much looks like spam or stuffing. So, for all you 6th grade dropouts, that would be 5 times for every 100 words on the homepage. Rest assured, this sounds like a lot, but when you write out 100 words, 5 instances of a word is not a lot at all. You will likely use the word “the” 15-20 times.
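
    To put the title and keyword advice together, a head section along these lines is the idea (a hypothetical example; the terms are made up, and the title stays well under 80 characters):

    <head>
      <title>Funny and Offensive Custom Shirts in Jacksonville, FL</title>
      <meta name="keywords" content="funny shirts, offensive shirts, custom shirts, Jacksonville, fl">
      <meta name="description" content="Custom, funny, and offensive shirts printed in Jacksonville, FL.">
    </head>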

    It’s also a good idea to give weight to your title words. The H1 tag has the most weight, bolding words is a good idea, and a bold internal link to a page that is about that term is great. Your main search terms should also be as close to, if not at, the top of your text, and should all be in close proximity to each other if at all possible. Again, you will hear advice to place the exact title at the top of your site. Think about this: if my title is “funny, stupid, mean, crazy, offensive, shirts, Jacksonville, fl,” how is that gonna look on your site? Not very good, I say. Just be sure to show weight and importance for these terms. Don’t overdo it. Remember, you want your site to look good for your customer while at the same time revealing what your site is all about to the search engines.
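
    In the page body, that weighting might look like this (a sketch using the same hypothetical shirt site):

    <h1>Funny Custom Shirts in Jacksonville, FL</h1>
    <p>We print <strong>funny shirts</strong> and
    <strong><a href="offensive-shirts.html">offensive shirts</a></strong>
    for customers all over Jacksonville.</p>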

  • SEO Factor Blog

    October 6, 2006

    I was very recently asked by a friend about alt tags in reference to using them to increase keyword relevancy. I answered, and she wanted to know if it was bad to stuff them with keywords (really funny that she should use the word ‘stuffing.’ You’ll see later), and if so, how bad. I hope this helps explain it.

    Alt tags were created and implemented for Internet browsers that don’t display images, and are used by those with visual impairments. So, if I am blind and I go to a site, my computer will tell me what the picture on the site is (whatever is written in the alt tag). The code is as such: <img src="picture.jpg" alt="whatever">. Being that this is a function for the greater good of the Internet, a lot of search engines looked at this tag for a number of reasons. First, no search engine can see images, so they use the tag to tell what the image is. Second, and Google is known for this, a proper alt tag reflects positively on a website, as it shows the intent of its creator by accommodating the disabilities of some web surfers, thus making for a better online community.

    Because of the search engines’ view of this tag, we used to (used to, as in a long long time ago) be able to stuff the alt tag with a lot of keywords and such for higher rankings. The search engines thought, “hey, they have a picture of a car on this website,” proving further that the site was relevant to the term ‘car.’ The search engines caught on rather quickly, and now will actually penalize for such offenses. The only thing they want to see in the alt tag is a very short, very specific description of that image.

    It is a common tactic to place the description tag, or a lot of keywords, in the alt tags. This, by definition, is known as “keyword stuffing.” It’s a giant no-no in SEO. The search engines are seeing a large repetition of the description tag exactly as it is written in the meta. This on copy alone is bad. They are also seeing that every alt tag is the same. So they ask, “why does this site have the exact same picture on the page so many times? Oh, I don’t think it does. Nobody would do that. So I deduce that this site is just stuffing the terms. Oh wait, it also happens to be the description tag. Let’s stamp that with our ‘Seal of Disapproval’.” Google also sees this as taking advantage of and misusing a function that is in place to help those who need that help. Hence the severity with which they will sometimes penalize.
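
    To make the difference concrete, here is a hypothetical pair (the filename and wording are made up):

    <!-- Good: a short, specific description of the image -->
    <img src="red-convertible.jpg" alt="Red 1966 convertible parked at the beach">

    <!-- Bad: keyword stuffing, likely to draw a penalty -->
    <img src="red-convertible.jpg" alt="car cars buy car cheap cars used cars car dealer">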

    So, Unnamed Friend, to answer your question on stuffing the alt tags: it is bad.

  • SEO Factor Blog

    October 9, 2006

    Oftentimes we get so caught up in marketing our site and making it aesthetically pleasing, we forget one of, if not THE most important part of SEO: content. The search engines cannot see pictures, so the only way to tell them what your site is all about is to provide that information to them via text. But why is it so difficult to put text on our sites? During the creation of a site, we often care more about how it looks than how it functions. Flash, in my opinion, can be used to create some of the most beautiful sites out there. We get so drawn into making nice layouts with neat animations, we completely neglect the fact that we are building a site for the world, not just us.

    The content on your site should actually be the first thing you think of when optimizing it. You need to convey to the search engines that your site is worthy of ranking, and that you are here to offer something to the masses. I often give this advice to customers, and get a great deal of fearful feedback. I hear a lot of “I don’t even know what to write.” Well, your site is an extension of your business/service/whatever, and as such you are likely to be well versed in that area. But I think people get caught up in the idea that the text is going to be placed on the site, promoting attention to variables such as what font to use, what color, how it will look, etc.

    The best advice that I can give in reference to thinking of text is to close your site, use Notepad (not Word; that creates problems when you decide to copy and paste), or a pen and paper (these do exist, you know), and just put your thoughts to reality. Just write. Write about your business, yourself, your products, the history of your company, whatever. Just write. Before you know it, you will have well over 800 words of raw information all about your industry. As I write this, I am not thinking about my blog or my site, but only what I am trying to convey to you, the reader. Once you have all that information, organize it. Make it mean something, and make sense. Given enough thought, it will not be too difficult to come up with at least 500 words, which is a good amount of text to be placed on a website. Once the content is on the site, you can obviously tweak it here and there to make proper use of your search terms (this will most likely happen anyway; you are writing on or about the same subject as your site, right?). And once that is done, you can then figure out the layout, spacing, font, and all the other wonderful cosmetics the online world can offer.

    As with all things online, this is easier written than done. But try this method out and see if it helps. Writing is my very weakest point, and this helps a great deal. Content is so very important to search engine optimization, and should be treated as such.

  • SEO Factor Blog

    October 9, 2006

    I’m really not sure where this information is coming from, but I’m getting a lot of clients asking me about poison words. This is an excellent example of a Google myth. There is no such thing as poison words. There are ‘stop’ words, but those have less to do with your ranking than with making good use of the space you have. I want to debunk this myth right now.

    There are tools and “experts” that say certain words in your title, description, or meta tags will get your site penalized by Google. These words include “lingerie,” “adult,” “sex,” “free,” and various schoolyard no-no words. I’m not really sure where or why this myth started. Actually, I think I do know why. Google is known for its distaste for porn and spammy sites. It’s not really that far-fetched to think that there would be trigger words that would mark a site as such material and ban it.

    There is indeed a service provided by Google called SafeSearch, but it doesn’t really work in the fashion described above. It is an optional filter that can be used in an attempt to omit certain types of sites from appearing in Google search results.

    So, let’s get to the matter at hand. Google would not want to ban a site that contains the word(s) “lingerie,” “sexy,” or any variation as such. Think about the massive industry that is retail lingerie sales; ‘Victoria’s Secret’ comes to mind. “Sex” is not exactly fair to target either, what with the ever-growing campaign for sex education and disease awareness that various organizations and government agencies have been pushing for the last 30 years or so. “Adult,” well, that’s just stupid. I could understand “free,” as that is a word often used by spammers, but think of all the legitimate applications; just too many to ban a word entirely. If you wanna test this yourself, do a search for “free lingerie” in Google and check the meta of all the sites on page one. I guess this post is coming to an end, as I soon will be resting my case. I did mention “stop words.” Unlike poison words, these are words that a search engine like Google will ignore: words like “and,” “for,” “the,” and the like. There is no penalty for including these in your meta. There is, however, a length your meta should not exceed, beyond which your on-page optimization efforts suffer. So it makes sense to be mindful of these words in order to conserve the space you are given for your meta.

    In closing, not everything written or said about SEO should be believed. The search engines are often a mystery, and naturally, we as humans will think up fantastic explanations for that mystery.

  • SEO Factor Blog

    October 7, 2006

    In the creation of a site, many people make use of frames. Frames are a way of making one page that stays static, a chunk of which is used to call information from other pages. Frames often get used because it is a person’s first site, or a relative made it after graduating from a website design program or just picking up a book. This isn’t really a bad thing, because most website design/HTML books don’t even go into SEO, and the use of frames is one of the very first subjects covered. It can be a neat little method of keeping a common look throughout the site before learning CSS. More on creating frames can be found here at my favorite online tutorial, Tizag.

    The problem with frames is that some browsers don’t play well with them, and the information on the page is often within a frame that cannot be seen. Here is a way to get around this problem, posted on Search Engine Watch, while still making use of the frames method. Keep in mind that most methods of using frames and making them SEO friendly will only help so much. And it’s not really that much.
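
    I won’t reproduce the Search Engine Watch piece here, but the commonly cited workaround of that era is the noframes element, which gives engines and frame-blind browsers something to read. A sketch, with placeholder filenames:

    <frameset cols="25%,75%">
      <frame src="menu.html">
      <frame src="content.html">
      <noframes>
        <body>
          <!-- Crawlable text and links for visitors that never see the frames -->
          <p>Readable text describing the site, with regular links to
          <a href="menu.html">the menu</a> and <a href="content.html">the content</a>.</p>
        </body>
      </noframes>
    </frameset>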

    But maybe you have been talking to someone that doesn’t want to give up the frames no matter what you say. One good method is the creation of landing pages (note: not doorway pages or anything that is a “trick”). You can create a few pages that have the same look and feel as the framed pages, and place the bulk of relevant text on those pages. They will be the first set of pages on the site (that means replacing the homepage), and will link to the framed pages. Then these pages can be submitted to the search engines. I also think that this is a cheap and ‘shortcutty’ method. It leaves a void when a surfer is moving from framed to unframed pages, and decreases the amount of acceptable information that the search engines can see.

    In my opinion it’s probably best to make good use of CSS if you want a common feel on the pages of your site. This will give the desired effects, and work very well with surfer and search engine alike. Keep in mind my opinion on design should be taken lightly. I’m very analytical, so “pretty” is not a word I implement often. I am very lucky to be involved with people that have an incredible eye for design and a comparable understanding of SEO. They make the site, I make the site work. Now that’s a partnership.
