If Search Engines Were Perfect

If search engines were 100% relevant…

There would be no search results. You would only need the I’m Feeling Lucky Button.

Every page on the internet would be a stop page. The searcher would use the SE to find exactly what they were looking for on the page it sent them to, and would no longer need to browse any other pages on your site.

Affiliate programs would not exist, because if searchers needed to buy something the SE would send them directly to the people who sell it.

Internet traffic would drop dramatically because hunting for information wouldn’t be necessary.

No new ecommerce sites would emerge to compete, because the SEs would automatically send all the people on the Internet to the most reputable online stores.

There would be no need for sites that review other sites.

Directories like DMOZ would no longer exist.

No sites would link to other sites because the visitor wouldn’t be at your site if it wasn’t exactly what they wanted.

Forums and chat wouldn’t be needed because finding the information would become easier than asking for it.

The web would be dead.

–>

Trust Factors - Getting To Second Base With Sally Search Engine

Just so no one gets confused, I want you all to realize that I do not always practice what I preach. There simply isn’t time for it all. Whenever I create a new site I decide in a written plan which techniques and practices will make me the most money with the least amount of effort for the longest amount of time. Although I have many sites that are over 2 years old, I very rarely put daily promotion work into any of my sites that are over 6 months old. I put a lot of effort into the first six months of a site and then let it perpetuate its own promotion (which is coincidentally about when I start getting REALLY sick of staring at the site). This allows me to not only keep my sanity, but to prevent myself from putting too many apples in one basket (I think that’s the way the saying goes). This practice however tends to put a vice over my head when it comes to search engines, and that vice is called Trust Factor. One of the greatest strengths you can have in business is knowing your own weaknesses, so I’d like to use this article to help us all speculate about what I consider my biggest weakness in SEO, and possibly yours.

Factors of Trust Rank

Here are the factors I think the search engines possibly use to determine your trust factor.

  • Age of Domain
  • Keywords in domain
  • Inbound links from sites that compete for the same keywords
  • Age of Inbound links from sites that compete for the same keywords
  • Stickiness of anchor text of inbound links
  • Ratio of inbound links from sites that don’t compete for the same keywords to sites that do.
  • Links from authority sites
  • Stickiness of links from authority sites
  • Percentage of pages on your site that are on topic with the main page.
  • Outbound links that result in a page competing for your keywords.
  • Site being available in the Google directory.
  • Also note that I believe there are trust boosts available for the overall size of your site, as well as for reaching a certain goal of inbound links.

Using these twelve factors, let’s assume they are all weighted equally. From there we can derive an algorithm to determine an estimated trust factor for our own sites, so we can see how we fare on a scale.

The algorithm

# Age of Domain factor
If (Age_Of_Domain > 5 years) { +10 Trust }
Elseif (Age_Of_Domain > 1 year) { +5 Trust }
Else { +0 Trust }

# Keywords in Domain factor
If (Keywords_In_Domain = All_Keywords) & (Keywords_In_Domain >= 1) & (Keywords_In_Domain < 5) { +8 Trust }
Else { +0 Trust }

# Inbound Links From Competing Sites factor
If (Inbound_Links_From_Competing_Sites >= 100) { +10 Trust }
Else { +Trust = Inbound_Links_From_Competing_Sites / 10 }

# Average Age of Inbound Links From Competing Sites factor
If (Average_Age_Of_Inbound_Links_From_Competing_Sites > 100 days) { +10 Trust }
Else { +Trust = Average_Age_Of_Inbound_Links_From_Competing_Sites / 10 }

Stop!

I will stop the algorithm right here because by now you’re catching my point. If anyone actually feels like making a tool of this, PLEASE LET ME KNOW! Once you read the algorithm you realize the possibilities. When you are finished there are 144 possible points. If you take your number of points, divide it by 10, and drop the remainder, you get a scale of 1-14. Assume that 11-14 points means you are classified as an authority site. This leaves you with a scale of 1-10 for the search engines’ trust in your site.
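Since someone may actually want that tool, here is a minimal sketch in Python of how the scoring could be wired up. It only covers the four factors spelled out above plus the final 1-14 scale; every function and variable name is my own invention, and every threshold and weight comes from the speculation in this post, not from anything a search engine has confirmed.

# Speculative trust-factor calculator based on the factors above.
# All thresholds and weights are guesses from this post, nothing official.

def trust_score(domain_age_years, keywords_in_domain, all_keywords_present,
                inbound_competing_links, avg_inbound_link_age_days):
    score = 0.0

    # Age of Domain factor
    if domain_age_years > 5:
        score += 10
    elif domain_age_years > 1:
        score += 5

    # Keywords in Domain factor
    if all_keywords_present and 1 <= keywords_in_domain < 5:
        score += 8

    # Inbound Links From Competing Sites factor
    score += 10 if inbound_competing_links >= 100 else inbound_competing_links / 10

    # Average Age of Inbound Links From Competing Sites factor
    score += 10 if avg_inbound_link_age_days > 100 else avg_inbound_link_age_days / 10

    # ...the remaining eight factors would be scored the same way,
    # bringing the theoretical maximum to 144 points...
    return score

def trust_scale(score):
    # 144 possible points; divide by 10 and drop the remainder for a 1-14
    # scale, where 11-14 is treated as an authority site.
    bucket = int(score // 10)
    return "authority site" if bucket >= 11 else bucket

# Example: a 3 year old domain containing the keywords, with 40 competing
# inbound links averaging 60 days old.
print(trust_scale(trust_score(3, 2, True, 40, 60)))

Feed it whatever estimates you can dig up for your own site and treat the output as a back-of-the-napkin number, not a real Google metric.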

From here comes the legwork, and yes, I’m just as guilty as the rest of you, if not more, of not putting as much work into this as needs to be done. Take a look at the assumed factors and put them on a scale: +10 being the best you can be, or at least well above average, and +0 having none of the factor. We can at least use this to judge how much the search engines trust us.

I would really like to talk to some experts on the matter and have them maybe shed some light on this subject for us. Until then I am forced to listen to the person I know best… me.

–>

Google Page Rank Update Settling Down

So the Google PageRank update seems to be settling down. My sites have been in flux for about 2 weeks now. A few are still fluctuating at the moment, but most are starting to steady out. On the plus side, just about all my sites have gained at least 1 PR. The few that didn’t gain at least stayed the same and had subpages rise. Please post comments on how your sites are faring in this update.

One thing to note is that it seems Google has fixed the problem of giving two different PageRanks to / and /index.html. HOWEVER! This is funny. Google didn’t seem to fix that problem for index.php, which forces me to ask one question: what the hell?! Once you figure out how to properly canonicalize one, how hard could it possibly be to duplicate the process for the other popular extensions?

In the wake of Big Daddy and the PR update, I am opening Blue Hat SEO up to site reviews. If you would like to receive some input on your site, feel free to either email me or post it as a comment. Whether it be layout reviews or SEO reviews, let me know where your concerns lie and I will be happy to give any help I can. If you email me, also be sure to note whether or not you mind your review being posted on BlueHatSEO.com. As long as people don’t mind, I would like to make most of my reviews public in case I happen to find something that may be of help to others.

–>

Happy Valentine’s Day

Sorry I haven’t been posting lately. Been really busy. I promise I’ll get back on schedule soon. Until then, enjoy your Valentine’s Day!

–>

I Was Reading This BlackHat Forum

I was reading this blackhat forum today, ’cause I enjoy that kind of stuff, I suppose. It was funny. They were talking about new ideas for CSS spamming. Some were talking JavaScript. Some were talking about hiding the text behind a picture. All sorts of witty shit that probably took them hours to devise. I just can’t help but laugh and wonder if any of them have ever heard of a marquee.

Look at me hide a bunch of text from the search engines. I can make this as small and hidden as I want. The best part is the search engines actually read this as normal text.

It’s funny how some of these people work so hard to let their spam go undetected using the latest and greatest technologies in web design, yet none of them ever thought about using simple HTML 3.0 tags that can’t be banned or penalized. Although I’m not a black hatter, I enjoy black hat discussion because it’s so much more creative and inventive than the boring, repetitive white hat stuff. Perhaps Blue Hat will become the non-boring white hat some day?

That will never happen with posts like this

–>

Don’t Follow The Herd

From reading the title you already know what this article is meant to convince you of. So I will keep it short and spare you the rhetoric.

Search engine rankings are a competition. The people who follow the standard promotion techniques will inevitably lose. This applies especially to the common free promotion techniques. This perhaps became most transparent with Google’s decision to devalue what was coined the DMOZ effect. In short, the DMOZ effect was the ability of sites listed in DMOZ to gain good PageRank and inbound links solely because of the large number of sites that mirror DMOZ. At this point Google has even gone so far as to stop crawling once they realize a directory is a mirror of DMOZ. The same practice has been noticed on MSN, and Yahoo seems to be in the process of following suit.

The question now is how far will this go? At the moment people are loving the benefits of article writing. I personally believe this is the next thing on the major search engines’ chopping block. Will the search engines start devaluing links coming from article directories? Maybe. In the meantime, here’s a good philosophy to go by.

Do what everyone else does, and then do some more. This is the only true way to be competitive. By regularly reading BlueHatSEO.com you’re at least at a good starting point for moving beyond the herd. I wish you all the success you deserve.

–>

Blue Hat Technique #6 - Content For Your Website

With the new trend of article sharing, concerns have been raised about duplicate content penalties. Now that everyone is in a fuss about whether or not to use widely distributed articles from article submission sites, fears are rising as to how search engines will react. Well, consider this: the author probably submitted the article to 100+ article directories. Plus, we’ll assume the article was actually used by a good 10 websites a month. By the end of a Google update (3 or so months), that’s 130 sites that have the exact same article as you do on your website.

This is bad for several reasons.

1) Your site stands a much smaller chance of ranking for small phrases and keywords that aren’t used much but bring good traffic, simply because many articles = lots of potential phrases for people to search for. No matter how small the phrase that fits your article, you still have 130 sites competing for it.

2) Possible duplicate content penalties. These are a gray area. None of the research I found has been able to accurately explain if or how search engines penalize a site for having the exact same content as another site.

3) You have to link to the author’s site and sometimes the article directory you received it from. Search engines then know you are not the original author, so they rank the article directories high and the author’s site highest. You get lost in the middle.

Question of the day: How do we get expert content that is truly fresh and not going to be used on any other sites?

This is so stupidly obvious I can’t believe it’s not already ragingly popular. Ask an expert! Can’t find one? Try using chatrooms like IRC. It’s easy to find a chatroom on almost any subject, and chances are that if there are people in it, not only are they bored, but they truly enjoy talking about the subject at hand. Ask them a long, pointed question that will induce a rant. I’ll give an example: what dogs are most impressive in a dog show, and why? A question of this manner will induce a huge rant and probably a debate. With a little editing and cleaning up you have yourself a unique expert article on the subject. A quick logging script (sketched below) makes it easy to capture the whole conversation for editing later.
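For anyone who wants to try it, here is a bare-bones sketch in Python that logs an IRC channel to a text file so you can clean the rant up into an article afterwards. The server, channel, and nickname below are made-up placeholders for the example, not a recommendation of any particular network.

# Minimal IRC channel logger for capturing raw chat to edit into an article.
# The server, channel, and nickname are placeholders.
import socket

SERVER, PORT = "irc.example.net", 6667
CHANNEL, NICK = "#dogshows", "contenthunter"

sock = socket.create_connection((SERVER, PORT))
sock.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :{NICK}\r\n".encode())
sock.sendall(f"JOIN {CHANNEL}\r\n".encode())

with open("chat_log.txt", "a", encoding="utf-8") as log:
    buffer = ""
    while True:
        buffer += sock.recv(4096).decode(errors="replace")
        while "\r\n" in buffer:
            line, buffer = buffer.split("\r\n", 1)
            if line.startswith("PING"):
                # Answer server keepalives so we don't get disconnected.
                sock.sendall(line.replace("PING", "PONG", 1).encode() + b"\r\n")
            elif "PRIVMSG" in line:
                # ":nick!user@host PRIVMSG #channel :message" -> "nick: message"
                nick = line.split("!", 1)[0].lstrip(":")
                text = line.split("PRIVMSG", 1)[1].split(":", 1)[1]
                log.write(f"{nick}: {text}\n")
                log.flush()

Paste your question into the channel, let the discussion run, then edit chat_log.txt down by hand.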

This tactic is advantageous for several reasons.

1) The content is completely unique. You will be the first to have it. Chat room content generally doesn’t get posted on websites or forums. No one will have content like this.

2) With a little fact checking it will be useful and reliable content for your visitors. This helps if you already know a little bit about your field.

3) It’s MUCH easier than writing your own articles.

–>

Matt Cutts On Webmaster Radio

For those of you who missed the show Matt Cutts gave on Webmaster Radio, you missed a pretty fun chat. I think I actually had almost as much fun in the chat room as Matt Cutts did on the air. The exception of course would be the one or two pissed off webmasters who were upset because their sites were outranked by some spam pages. In fact, one stood out as being insistently persistent about spamming the chatroom to find out “is it worth the effort to report the spam pages that outrank me?”

–>

T Minus 5 Minutes

For those of you who aren’t aware, Matt Cutts will be doing a live interview on Webmaster Radio in five minutes. I’ll try to post some of the transcript after the show for those of you who missed it.

–>