My CMS

  • SEO Factor Blog

    February 6, 2007 Ok, so we all know how important IBLs (inbound links) are for ranking in Google. I’ve gone over their importance and what factors are taken into account, but one of the most frustrating things was that you never really knew how many links Google was giving you credit for. The ‘link:’ operator was never all-inclusive, since Google keeps that metric close to the vest because of its weight in determining your ranking. Well… Google will now let you, and only you, see this information, as long as you can prove you own the site with Google’s ‘Webmaster Tools.’

    More here.

    This is incredible news. It should help us all understand a bit more about how Google determines where you stand. But keep in mind that this is new, and as such, it will not be free of inconsistencies or errors. Still, it is an excellent step forward in understanding how the Big G works.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/59/

  • SEO Factor Blog

    October 9, 2006 There are a number of big names in the SEO and online marketing industry. A lot of times I throw them around as answers to “people I would like to meet” or “what source do you have for this information?” More often than not, though, there are a handful of names that continue to come into play. After about 2 years of all of your funny looks, I thought it was time I let you just a little further into my world. So, here are a few introductions. Oh, and a lot of “SEO Experts” claim to know these people on a personal level to boost their credibility in hopes of making a sale. That is usually not the case, and it certainly isn’t in mine.

    Danny Sullivan

    • Creator of Search Engine Watch, Journalist, and Internet Consultant.

    Danny Sullivan became involved in SEO, unbeknownst to him, in 1995, when he had a client complaining that his website was nowhere to be found in the search engines. At this time, SEO was little more than a thought. There were no handy-dandy tools, no SEO forums, and no guides to help in the matter of exposure online. So what did Danny do? He decided to research what it takes for a search engine to decide to rank sites higher. He modified, tracked, and re-modified his sites to determine what changes had what effects. After a long period of time, and a large number of answers, Danny posted his information online, and it soon became the very first “A Webmaster’s Guide To Search Engines.” Nothing like this information had existed beforehand, and it may have indeed jump-started this whole industry. While checking out the pictures from this year’s Google Dance, his picture appeared and my daughter asked, “Daddy, who is that guy?” To which I answered, “he is the guy that made Daddy’s job possible.” With so much acclaim, and even more e-mail encouragement from grateful website designers, Danny maintained the information online for the world to partake of. As time passed, he dug deeper to find what it takes to index sites, increase ranking, increase traffic, and how the search engines work overall.

    In 1997, Danny launched what is now arguably the most important SEO information site online, Search Engine Watch. It contained his “A Webmaster’s Guide To Search Engines,” and so much more. SEW has changed ownership (as many great things in business do) but is still maintained and edited by Danny Sullivan himself. Years later, this site is one of the very first that I visit every single day, and one I return to throughout the day. There are very few times that I can stumble onto a question that is not answered here.

    Matt Cutts

    • Google Software Engineer, Superior Blogger, Nice Guy, CIA Operative.

    Matt Cutts began his employment with Google in 2000, where he worked as a software engineer, and ended up creating Google’s SafeSearch (Google’s family filter). He deals primarily with issues of quality (as in spammers) and webmaster concerns (like the algorithm concerning ranking and penalization). Matt’s name wasn’t nearly as famous (or infamous, depending on your perspective) until he started his blog. Using his blog, Matt talks SEO, Google’s products, life at Google, and a lot of overall nerdy stuff. When his pic came up on my computer and my daughter asked about him, I replied, “that’s the son of a B**** that makes Daddy’s job really hard.” With respect, of course.

    *Little more is known of Matt because he works for the NSA under the guise of a Google employee, so he can read and distribute all of your emails to the government.* *I am totally kidding. Being in the spotlight carries a lot of responsibilities and burdens, one of which is dealing with petty rumors and crass people. There is a site (that I am not going to justify by naming here) that makes my joke, sort of, in earnest, in a very sorry attempt to degrade Google’s integrity. Matt did indeed work for the Department of Defense while going to school for his M.S. degree from the University of North Carolina at Chapel Hill. That, plus B.S. degrees from the University of Kentucky in both mathematics and computer science, means he’s a really smart guy. He worked for the DoD as an elective, alternating a few months of study with a few in the work environment. Think about it though: even if the rumors are true, and the government used Matt to get your Gmails and read through them, do you really think that’s the worst thing our government has done? Probably not. Matt is widely noted as an extremely nice and approachable guy. Many Google employees will openly admit that they hate walking the floors of seminars and conventions with a Google badge on, as they soon become the target for angry webmasters. Matt easily gets the most of this activity, and returns it with a smile.

    I will likely be doing an on-going series of posts on some other big names in marketing and the Internet industry. I believe that it is a very good idea to know the people you work with, even if not directly.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/33/

  • SEO Factor Blog

    October 9, 2006 As with all things in the website world, the idea, practice, and importance of link popularity is always changing. The rule used to simply be, “get as many links as you can from anywhere at all.” This is no longer so. After that rule, it became, “make sure the site linking to you is relevant in some way.” While this still holds true, the term “relevant” has undergone some modifications. It has never been too difficult to find sites that are remotely related to another with a little bit of time and initiative. So it was easy to make this rule work in the favor of the SEO Specialist. Very soon after that rule, it became the next-to-most-recent one: “make sure it’s a relevant site that has a higher PR (read: more inbound links) than your site.” This made it just a little bit more tedious and tricky. Hence the price for a “real” link popularity service. But it was still easy to manipulate in your favor. But you know what’s not easy to manipulate? Someone else’s page altogether.

    Enter the new definition of “relevant” in the website world. What Google is leaning toward is not only link popularity, but that the relevant site should make use of internal linking within their content, using the desired search terms for the linked site. Confused? Let’s digress.

    I have a pet store site. I have a good ranking for all my “dog” terms (dog food, leashes, whatever). But the terms for all my “cat” products result in nothing. So I will need to modify my link-building campaign a little to gear toward the feline-loving community. During my search I get a link on some other pet store in a different state. They agree to make the anchor text reflect my desired terms. This helps, but more importantly they make use of internal linking with key terms such as “cat leashes” or “cat nip” or “cat shaving device.” (I couldn’t help it.) The fact that this site makes use of these terms in internal linking gives it a lot more relevance in relation to my inbound link.
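    If you want to vet a potential link partner this way, the check is easy to script: pull the anchor text of their links and count how many mention your terms. Here is a minimal sketch using Python’s standard library; the class and function names (and the sample page) are mine, purely for illustration, not any real tool’s API.

```python
from html.parser import HTMLParser


class AnchorTextScanner(HTMLParser):
    """Collects the anchor text of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self._in_link = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self.anchors.append("")  # start a new anchor-text bucket

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        if self._in_link:
            self.anchors[-1] += data


def keyword_anchor_count(html, keywords):
    """Count links whose anchor text mentions any of the target terms."""
    scanner = AnchorTextScanner()
    scanner.feed(html)
    return sum(
        any(kw.lower() in text.lower() for kw in keywords)
        for text in scanner.anchors
    )


page = '<p><a href="/cat-leashes">cat leashes</a> and <a href="/dogs">dog food</a></p>'
print(keyword_anchor_count(page, ["cat leashes", "cat nip"]))  # 1
```

    Run this against a real page: a result of 0 tells you the site never links internally with your terms, which, by the reasoning above, makes its link worth less to you.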

    So, as this goes further, be ever more vigilant when approaching another site in reference to link exchanging/requesting. I know this post was a bit tricky, so as always, please feel free to e-mail me.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/32/

  • SEO Factor Blog

    October 10, 2006 Google goes through yet another growth spurt with its recent acquisition of YouTube. Eric Schmidt, Chief Executive Officer of Google, justifies the $1.65 billion payment with “Our companies share similar values; we both always put our users first and are committed to innovating to improve their experience. Together, we are natural partners to offer a compelling media entertainment service to users, content owners and advertisers.” This is nothing new for Google. They have a history of buying products, software and even industries so as to integrate them and make them their own. You may remember when Google purchased SketchUp, a CAD software, not only using it to enhance Google Earth, but providing the software for free to the public. That’s pretty neat if you ask me. But what will this mean for YouTube? Google suggests that the company will remain independent of Google, keeping its brand recognition and retaining its employees. My vote is that Google will use YouTube to further enhance and market Google Video, while making use of the advertising real estate that YouTube can provide. This may actually be the last of the acquisitions for a while though, as Sergey Brin, Co-Founder & President, stated that they are going to back off mergers and acquisitions for a while to concentrate more of their efforts on the search engine itself. This is incredible news to me, as we are seeing a lot of pitfalls and downward spirals with Google’s searching capabilities as of late, and I for one would like to see a resolution.

    We shall see.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/40/

  • SEO Factor Blog

    October 9, 2006 It seems that Yahoo!’s organic listings include not only “relevant” results (and I use quotes for a reason), but they also show a Pay-Per-Click service.

    Through Search Submit Express you can submit your listings for consideration to appear in algorithmic search results powered by the Yahoo! search engine. –Yahoo! in reference to the Search Submit Express Service

    You can read the full details here.

    This means that being relevant is not good enough for Yahoo!. Someone who pays money has the option of a sponsored result; but if you would like to pay a little less (yet more than nothing), you can also get results in the area that should be reserved for the organic listings.

    No bidding. Listings generated automatically based on the relevance of your page content to search terms. –Yahoo! in reference to the Search Submit Express Service

    If this were the case (relevancy matters) then why should I have to pay anything at all?

    So how is this fair? It’s not. Actually, according to the FTC, there is a chance that this practice is illegal. You see, if there is a section for organic listings, and it is laced with paid inclusion results, and there is nothing in that area stating that the listings may be of a paid inclusion sort, that goes against FTC Guidelines in the matter.

    Complement your sponsored search campaign and extend your reach by capturing leads from algorithmic search results. –Yahoo! in reference to the Search Submit Express Service

    So I will indeed get more leads from this service? But how will I get more leads if I don’t get any higher rankings? According to these guidelines, and the Yahoo! Search Submit Express page, paying for this service does not result in better rankings. I’ll tell you about my proof, but first consider this: if it does not provide higher results, then there is absolutely no benefit to paying at all. Why would you have to pay per click in an area that is reserved for the regular listings anyway? There would be no benefit for the person paying, or for the search engine. The search engine would not make money, as a site that was not ranking in the first place would do no better. And the person paying (and there is an initial fee) would not get any clicks, because they were not ranking highly anyway. Unless this is not the case. I am working now, and have worked, on a great number of sites with this service. The unfortunate truth is that once included, the results are indeed much better. As soon as we stop those listings, the rankings drop as well. We opt the customer back in, they get rankings again. Odd.

    But, it is difficult to really complain as this is indeed the game that we play, and the search engines make all the rules. I do feel that this is a little misleading and unfair to those that have very relevant and important sites that may not get ranked because they aren’t giving Yahoo! money. Hopefully this will help you understand a bit more about Yahoo!, and some of the reasons that your rankings are the way they are.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/35/

  • SEO Factor Blog

    October 9, 2006 Google filed for a patent recently that would allow for algorithmic ranking along with the efforts of human editing. This would mean that the bulk of the work in determining ranking and results would rest with a computer, but would be helped by the integration of editorial opinion. You can read the patent details here.

    So, what does this mean? It means that there is a chance that a human will make a piece of the decision on how relevant or beneficial a site is in reference to its visibility online. So it would be a good idea to make sure that your sites are indeed built for the human. This shouldn’t be anything new, but it will help to reinforce what is considered acceptable SEO practice. Sites shouldn’t really be built for the search engines, but for the end user/visitor/buyer. Even as a consultant, I often tell my clients that we will not be optimizing a site for the search engines as my title implies, but optimizing a site to show the search engines what the site is all about. When you concentrate on the search engines, you neglect what is really important, and what the site was created for in the first place: the customer.

    This patent will force that along, because there will be a person looking at your site and giving an opinion on how important the site is to the online community, how relevant it is to its desired search terms, and whether or not it will be considered an authoritative source of information or services. So if you’re creating a site for the algorithmic machine, then you’re probably going to be disappointed when someone at Google looks at your site with disdain and sends it to the bottom of the list.

    However. Just because a search engine files for a patent, it doesn’t mean that they will implement any of the changes. These are public records, and this could very well be an ‘Art of War’ kind of situation. Smoke and mirrors. Illusions of the mind. Either way, the same rule holds true. Make a nice site, for your customers. Not some stupid robot.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/34/

  • SEO Factor Blog

    October 9, 2006 A lot of times people ask me where I get my information or where they should start reading to learn more about SEO. I usually reply with a half cocky/half joking “read my blog, it has all you need to know.”

    Obviously this is not the truth, as no one person can know all there is to know about Search Engine Optimization (SEO). And even if there were, that person would not likely also know everything there is to know about Search Engine Marketing (SEM). So, giving credit to Rand Fishkin at SEOmoz, the following link is to his list of the blogs he reads. Mine is not quite as extensive due to particular tastes and such, but this is a great way to go through a lot of the bigger names, and make a list of your own.

    Ranking 50 Top Blogs in the Search Space

    WebproWorld also has a great post by incredible help on ‘SEO, Where Do I Begin?’

    I do want to point something out about resources though. I can honestly and humbly say that those on the list, and most of the mods at WebproWorld, know more of SEO, have more experience, and could likely outrank me any day. But in this field, learning what it is and knowing what it is are 2 different things. I could claim all day long that the information I give is correct, that I have ranked many sites well, and that I have a 100% success rate (I really don’t). But in this whole blog, have I ever once mentioned a single URL in my hands? Nope, not once. So why believe me? Those I work with who are reading this do, because they have seen it. But if you are visiting from outside of our little circle, you shouldn’t. I try very hard to make sure I am conveying information correctly, but it is always best to try these things out yourself before holding them as gospel. There is a saying in physics (yes Billy, it’s a saying in mathematics too…jerk) that until something has been measured, and assigned a number, it does not exist. Or something like that; it’s been a while since I was in school. All I’m saying is that all the book knowledge that can possibly be attained is no match for real-world experience. So just because someone tells you it’s true, doesn’t mean it is until you have tried it yourself.

    I hope these resources help. There is a plethora of great information out there. You just have to reach out and grab it.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/36/

  • SEO Factor Blog

    October 18, 2006 The following is a link to a post at SEOMoz. Rand started compiling data on on-page optimization factors, and provided all of us with this invaluable information.

    Rand’s List of Search Engine Ranking Factors

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/42/

  • SEO Factor Blog

    October 9, 2006 There are a bunch of them. I would like to share some of my favorites. First, though, I would like to say a thing. No tool is 100% accurate. Nothing is. Tools for SEO are online and rely on information given by different sources. For example, there are tools that check link popularity. Google will never let you know how many links they actually count; that would give a little info on how they determine ranking, and they don’t want that. Some tools check for ranking. Ranking is a relative term. Google has many different data centers, and a site that ranks 23 for a keyword on one center may show a different rank on a different center. So let’s look at some neat tools to use.

    Site Report Card

    This one is a pretty cool overall tool that shows a few different aspects of a site that can, and likely do, affect rankings. It grades each aspect on a scale of 0-10. It’s a good idea to get everything as close to a 10 as possible, but that is not always easily done, as some Flash conventions are not seen as acceptable by this one. It also shows link popularity and inclusion, which is a good way to judge how many of your pages are indexed. Also, the spell check shows less than satisfactory results sometimes, as there are words that we say and type that aren’t in the tool’s dictionary.

    Overture Keyword Selection

    This is a tool that was set up to show how many times a certain word or term was searched for in a month’s time. This can help you decide on what terms you want to optimize and market your site for. If you were thinking about a term that got 10 searches last month, you might want to look at other terms. If there were 17,234,879,240 searches, you might want to look at variations of the term with the next tool.

    Google Keyword Tool

    This is one of my favorites. This tool will show you a vast number of variations of a word or term, so that you aren’t trying to get your site to be number 1 on Google with the term “cell phone.” So you can check into other terms, use the previous search tool from Overture, and decide on your best plan of action for optimization and marketing.

    Search Engine Watch, SEO Book, WebProWorld

    Sometimes, the best tool in the world is the help of others. These are two of my most favoritest forums and my favorite blog (next to mine, of course). I have yet to ask a question that was not answered, and the amount of information provided here is immense. Make sure to get more than one opinion though. In the real world, people have a tendency to speak on things that they don’t completely understand or even remotely have experience in. Also, get to know the mods on these. You will start to notice that the majority of the posts are answered by a few individuals, and they are incredibly smart.

    There are also a number of other tools out there that “check meta tags.” Please be wary of these. I’m not saying not to use them, but I am saying that everybody has their own idea of what is acceptable. WideXL, for example, will tell you there are too many keywords or call out your “poison” words. First, if I have 21 keywords, don’t tell me I have too many of something Google doesn’t look at. Obviously you probably don’t want to have 5000 keywords, as there are some search engines that use them, and a hit is a hit (if it’s relevant). And “lingerie” is not a poison word. Look up “lingerie” in Google. See number 1. Look at their keywords. Yeah.

    So, check out these tools, research your SEO, and be mindful of the information and/or advice you get. Even from here. I try my hardest to keep the information here as correct as I can, but everyone makes mistakes. Unless you have tested it and counted the consequences, it hasn’t really happened in your world.

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/14/

  • SEO Factor Blog

    October 6, 2006 We have discussed meta tags and such. Now we will discuss the content that is on your homepage. The content is, in part, how the search engines know what your site is all about. They can’t see pictures, so naturally they will have to read your site. I do want to point out, with that said, that a picture of words will not count. This is often done in Photoshop, Flash, or some other imaging software to create effects on a website. So be sure that the text on the site is “real” HTML text.

    There are a lot of debates about the number of words that should appear on the homepage. 250 words is thrown around a lot. Try to think of it like this: if you are an expert in an area, you’re probably gonna have a lot to say about the subject. And if you have a lot to say about a subject, the search engines will feel that you are an expert in that area, and will likely want to show you first. Google themselves said 500 words one time, but again, don’t do things for the search engines, do them for your customer(s). And you obviously want to educate your customers, right? We like to throw out the 250-word number because it’s not a whole lot of work to get that much, and it gives ample opportunity to use the terms that you want to be found under.

    Now we come to density. You are proving to a search engine that you are relevant to a certain term. What better way than to use that term? Let’s stop for just a second. I’m gonna repeat something again. If I could, I would put a disclaimer between every single word I write so that it gets hammered into your brain: you don’t want to do things for the search engines. You want to prove to them that you are relevant, not trick them into thinking so. There is an abuse of density known as “stuffing.” Basically this means that you are placing your terms all over the place to show your relevancy. Don’t get caught up in that mindset. Let us continue.

    This is why it’s a good idea to still have a keywords tag. You want to be mindful of the terms that you want to be found under. So try to use these terms a minimum of one time each. Your title is the most important, so it should contain at least five of your most important search terms. Yahoo! especially puts priority on the title tag. Don’t use more than 80 characters; some reports are saying 70. You will want a good density of your title words in your text. Try for around 5%, but don’t exceed 10%-12%. Too much looks like spam or stuffing. So, for all you 6th grade dropouts, that would be 5 times for every 100 words on the homepage. Rest assured, this sounds like a lot, but when you write out 100 words, 5 instances of a word is not a lot at all. You will likely use the word “the” 15-20 times.
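    That 5-per-100 arithmetic is easy to check with a few lines of code. A rough sketch (the function name and the sample copy are my own inventions; real engines tokenize and weigh text far more cleverly than this):

```python
import re


def keyword_density(text, term):
    """Occurrences of a search term as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(term.lower()), text.lower()))
    return 100.0 * hits / len(words)


copy = "Cat leashes for every cat. Our cat leashes ship free."
print(round(keyword_density(copy, "cat leashes"), 1))  # 20.0: 2 hits in 10 words
```

    By this measure the sample copy is already past the 10%-12% ceiling above; that is exactly the kind of stuffing the post warns against.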

    It’s also a good idea to give weight to your title words. The H1 tag carries the most weight, bolding words is a good idea, and a bold internal link to a page that is about that term is great. Your main search terms should also be as close to the top of your text as possible, if not at the very top, and should all be in close proximity to each other if at all possible. Again, you’ll hear advice to place the exact title at the top of your site. Think about this: if my title is “funny, stupid, mean, crazy, offensive, shirts, Jacksonville, fl,” how is that gonna look on your site? Not very good, I say. Just be sure to show weight and importance to these terms. Don’t overdo it. Remember, you want your site to look good for your customer while at the same time revealing what your site is all about to the search engines.
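    The title advice above boils down to checks a script can make: stay under the character cutoff and get your key terms in. A hedged sketch (the helper name is made up, and the 70-character limit is just the conservative figure reported above, not a documented rule):

```python
def check_title(title, terms, max_len=70):
    """Flag a title tag that is too long or missing important search terms."""
    problems = []
    if len(title) > max_len:
        problems.append(f"title is {len(title)} chars; trim to {max_len}")
    for term in terms:
        # case-insensitive substring check for each desired search term
        if term.lower() not in title.lower():
            problems.append(f"missing term: {term}")
    return problems


print(check_title("Pet Store | Dog Food, Cat Leashes & More",
                  ["dog food", "cat leashes", "cat nip"]))  # ['missing term: cat nip']
```

    An empty list means the title passes both checks; anything returned is a candidate for a rewrite before you worry about density elsewhere on the page.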

    The URI to TrackBack this entry is: http://www.seo-factor.com/Blog/bblog/trackback.php/5/