SEO Checklist for E-Commerce Sites

Answering a question on Wickedfire here.

If you own an e-commerce site and don’t know where to begin with SEO, go through this checklist. In total, it’ll cost less than $500.

1. Sign up with all related forums. Put your site in the footer links and go through answering product-related questions on a weekly basis.

2. Create a free osCommerce or Zen Cart template related to your parent niche (if you sell CDs, make a music template), insert your link in it, and distribute it on template directories and their repositories.

3. Create an articles section on your site and put in a form allowing people to submit articles. Email hobby blogs in your niche asking to use some of their particularly good posts in exchange for a link back in the article. This will make them aware of your site and they might even link to you in future posts when talking about a particular product.

4. Steal your competitors’ articles and product reviews and do article distribution.

5. Create a blog on the site and give out manufacturer coupon codes regularly. This will sometimes help with getting negative results. Post those coupons on the forums from item #1.

6. Put all your products in Google Products (Froogle/Base). This will sometimes help with getting negative results.

7. Browse Google Products for small e-commerce sites with no reviews and similar products, and do link exchanges with them via an automated link exchange script on a separate page.

8. Make sure you optimize your on-site SEO. I assume you know how to do this.

9. Download all the product manuals, convert them to HTML, and attach them to each individual product. Link back to the product from each manual. This will give you more pages for indexing and catch a lot more longtail keywords.

10. Spam the fuck out of Yahoo answers and similar.

11. Directory submit! It may not work well for your other sites, but e-commerce sites are almost always welcome in directories.

12. Customize a nifty and unique toy style item with your logo on it and mail it to the most popular bloggers in your niche. Shirts and hats also work well.

13. If you have access to the products, get a webcam and pretend to be a vlogger. Review the products and post the videos on all the major video sites.

14. Create autoblogs and link wheels.

There’s more, but I think that’ll keep you busy enough for now.
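Item #9 above is mostly plumbing once you’ve extracted the manual text (e.g. with a tool like pdftotext). Here’s a minimal sketch of wrapping that text into an indexable HTML page that links back to the product; the product name, URL, and function name are all made up for illustration:

```python
# Hypothetical sketch for item #9: wrap already-extracted manual text into a
# standalone HTML page that links back to the product page.
import html

def manual_page(product_name, product_url, manual_text):
    """Build one HTML page per manual, with a backlink to the product."""
    body = "".join(
        f"<p>{html.escape(p)}</p>\n"
        for p in manual_text.split("\n\n") if p.strip()
    )
    return (
        "<html><head>"
        f"<title>{html.escape(product_name)} Manual</title>"
        "</head><body>\n"
        f"<h1>{html.escape(product_name)} Manual</h1>\n"
        f"{body}"
        f'<p><a href="{html.escape(product_url)}">Buy the {html.escape(product_name)}</a></p>\n'
        "</body></html>"
    )

page = manual_page("Acme Widget", "https://example.com/acme-widget",
                   "Setup instructions.\n\nSafety notes.")
```

Each paragraph of the manual becomes its own `<p>`, so every manual turns into a full longtail page rather than a wall of text.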

EDIT: There was some confusion in the comments about what I meant by “negative results.” Negative results (or “negative rankings”) are the results Google inserts inside the regular results, such as Video Results, Image Results, News Results, Product Results, and Blog Results. They used to always appear above the regular results, so we call them negative rankings because they’re less than #1. Now they tend to appear in random positions. This term may change the older this article gets.

–>

How To Overthrow A Wikipedia Result

A busy ranking artist runs into this problem quite often. I ran into it again the other day and figured I might as well show my Blue Hat peeps how to overcome it, since it’s a fairly common problem to have and there is a simple solution to it.

The Problem
Your site is holding a particular rank and a Wikipedia page is ranked right above it. The specific ranks don’t particularly matter, but much like Hillary Clinton in the primaries, you can’t possibly live with being beaten like that. You have to drop the Wikipage down a notch and continue moving up.

The Simple Solution
The simplicity of this tactic actually depends very heavily on the Wikipedia entry. Either way they’re all very beatable, but some are easier than others. In fact, as mentioned, I just ran into this problem recently and managed to knock the competing Wikipage entirely out of the top 20 in just two days using these steps. First you need to understand why the Wikipage ranks. Most of these pages rank for 3 reasons.

1) The domain authority of Wikipedia.org.

2) Innerlinking amongst other Wikipedia entries boosting the page’s value. <- Particularly the *See Also’s

3) Inbound links, most typically from blogs and forums. <- An observant person would notice not only the high percentage of links from blogs/forums in contrast to other types of links, but also a strong lack of sitewide links from any of those sites.

You obviously can’t do anything about the domain authority of Wikipedia.org, but understand that its pages are like a tripod: if you knock out one of the legs, the whole thing falls (pun intended). Now that you understand why it’s sitting right up above you like a towering fugly friend of the girl you’re trying to hit on, the solution becomes obvious: knock out reasons two and three.

Steps
1) Using your favorite link analysis tool (I prefer the simplistic Yahoo Site Explorer), find all the pages on the wikipedia.org domain that link to the particular Wikipedia entry.

2) Go to each listing and find the reference to the offending Wikipage. You’ll find most of them in the See Also section or linked throughout the article. This is where the simplicity I was talking about before comes into play. Entries such as “Flash Games” or “Election News” are easier because they’re so irrelevant. When people search Google for terms like these, they obviously want to find actual flash games or election news, not some Wikipedia page explaining what they are. The same concept applies to other Wikipages linking to them. Just because the author put the text Cat Food in the article or the See Also doesn’t mean it’s a relevant reference to the subject matter.

3) SLOWLY remove nearly all those bitches! Be sure to leave a good convincing reason for the removal in the edit summary. Remove as many as possible but strictly limit yourself. I understand Blue Hatters have a tendency to overdo things, but you’re just going to fuck yourself if you quickly go through each and every reference and mass delete them. If you don’t know how many you should remove, keep it to no more than 1-2 a day. Remove the references with the highest PageRank first if you’ve got a ranking emergency, and switch IPs between each one. This will either knock out one of its legs or at least cripple the leg a bit. Which leaves you with my match-and-exceed philosophy.

4) Find all the blogs and forums that link to that Wikipage and drop a link in as many of them as you can. Match and exceed. I’m not going to dive into the nofollow debate on this one or talk about the benefits of links via blog comments. Just realize your goal in this instance isn’t to get more links; it’s to get your link on the same pages that link to the Wikipage. As mentioned above, you’ll be dealing mostly with blogs and forums; you’re obviously in the same niche as the topics they’re talking about, and you probably won’t have any sitewide links to deal with, so you won’t have to go through any link-begging pains.

5) Try to drop your link into the article. This is common sense.
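Step 1 above can even be done without a link tool: Wikipedia itself exposes internal backlinks through the MediaWiki API’s `list=backlinks` query. A minimal sketch (the network call is omitted; the page title and sample response are placeholders):

```python
# Sketch of step 1: find the Wikipedia pages that internally link to an entry,
# using the MediaWiki API's list=backlinks query. Shown: building the request
# URL, plus parsing the JSON shape the API returns.
import urllib.parse

API = "https://en.wikipedia.org/w/api.php"

def backlinks_url(title, limit=500):
    """Build the API URL asking for article-namespace pages linking to `title`."""
    params = {
        "action": "query",
        "list": "backlinks",
        "bltitle": title,
        "blnamespace": 0,   # main/article namespace only
        "bllimit": limit,
        "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)

def parse_backlinks(response_json):
    """Pull the linking page titles out of an API response."""
    return [bl["title"] for bl in response_json["query"]["backlinks"]]

# A made-up sample of the response structure, for illustration:
sample = {"query": {"backlinks": [{"pageid": 1, "ns": 0, "title": "Flash animation"}]}}
```

Fetch the URL with any HTTP client, feed the JSON to `parse_backlinks`, and you have the list of entries to work through in step 2.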

Side Note
Wikipedia’s domain authority isn’t something you should be entirely worried about. Their site and URL structure actually becomes favorable and helps deaden some of the heightening factors.

OH, FYI! There is now a Printer Friendly link on every post on Blue Hat, by popular demand.

–>

Open Questions #4 – Diminishing Values On Outbound Links

I somehow missed this question from the Open Questions post and I can’t help but answer it.

From Adsenser

I loved your SEO Empire post. But I was wondering: how much effect does a lot of links from a lot of indexed pages on the same domain have? I always thought that the search engines looked mainly at the number of different domains linking to you. Can you give some more info on this? Or do you use these pages to link to a lot of different domains?

This is a fantastic opener for a conversation on sitewide outbound links’ effects on other sites as well as on the site itself. This has been long debated but never cleared up, not because it’s too complicated but because there are so many myths it’s hard to sort the fact from the fiction. To be clear in my answer, I’m going to refer to the site giving the link as the “host site” and the site receiving the link as the “target site,” just so I don’t have to play around with too much terminology.

The entire explanation of why sitewide links, main page links, subpage links, and reciprocal links work is based on a simple SEO law called Diminishing Values. It basically states that for every link, whether it be a reciprocal, innerlink, or outbound link, there is some form of consequence. Also, for every inbound link, innerlink accepted, or reciprocal link there is a benefit.

Diminishing Values = sum(benefits) > sum(consequences)

The need for the sum of the benefits to be greater than the sum of the consequences is essential because, as mentioned in my SEO Empire post, there can’t be a negative relevancy for a site in relation to a term. For example, let’s take the niche of cars. There’s a theoretical mass of car blogs; for the sake of the example we’ll say there are several thousand blogs on the subject of cars. Something happens in the industry that stirs all the bloggers, such as SEMA putting on a car show. So all these car blogs blog about SEMA’s upcoming car show and give it a link. If these outbound links caused a consequence greater than or equal to the benefit given to SEMA, then all these blogs would drop in value for the topic, cars. The mass effect would be that of a negative relevancy: sites with no relevancy but containing on-topic links would, by all theory, rank higher than the general consensus of on-topic sites.
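The SEMA argument can be put into a toy numeric model. All the numbers here are invented for illustration; the only claim carried over from the text is that a link’s benefit to its target must exceed its consequence to its host:

```python
# Toy model of the Diminishing Values law, with made-up figures.
# Each outbound link transfers `benefit` to its target and costs its host
# `consequence`. The law requires benefit > consequence, otherwise a linking
# frenzy would sink every on-topic blog below sites that never link out.

def net_value(base, outbound_links, consequence):
    """A site's value after paying the cost of its outbound links."""
    return base - outbound_links * consequence

blogs = 1000                      # car blogs that all link to the SEMA show
benefit, consequence = 1.0, 0.1   # benefit per link must exceed consequence

linking_blog = net_value(base=10.0, outbound_links=1, consequence=consequence)
silent_site  = net_value(base=10.0, outbound_links=0, consequence=consequence)
sema_gain    = blogs * benefit    # value the event page picks up
```

The linking blogs each lose a sliver, but nowhere near what the event page gains, so the topical sites keep outranking the silent off-topic ones, matching what we actually observe in the SERPs.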

So the notion of an outbound link diminishing your site’s value in equal proportion is complete bupkis and obviously not the way things actually work. Even if it were true, and there were compensation from on-site SEO, when an event in a niche happens the site hosting the event wouldn’t just rise in the rankings; it would propel everyone else downwards, causing more turbulence in the SERPs than what happens in actuality with just their site rising. It’s simple SEO Theory 101, but sadly a lot of people believe it. There are also a lot of sites that absolutely won’t link to any sites within their topic for fear that their rankings will suddenly plummet the moment they do. They’re under the greedy impression that they’re somehow hoarding their link value and that it is in some way benefiting them. So with the assumption that an outbound link gives much more value to its target than it diminishes from its host, everything in a sense balances out and outbound links become much less scary. This in no way says that the consequence to the host is a diminishment of any sort. The entire consequence could be 0, or, as a lot of other people believe, +X (some people think on-topic outbound links actually add to your site’s relevancy). I haven’t personally seen one of my sites go up in rank after adding an outbound link, but I’m open to the idea or to the concept becoming reality in the future.

I Practice What I Preach
The Law of Diminishing Values is one of the reasons why BlueHatSEO is one of the only SEO blogs that has all dofollow comments as well as the top commentators plugin on every page. Your comments will not hurt my rankings. I’ll say that one more time: Your comments will not hurt my rankings. Whewww, I feel better.

Back To The Question
Before we get into the meat of the question, we’ll take a small-scale example that we should all know the answer to.

Q: If a host site writes an automated link exchange script that does thousands of link exchanges and puts those links on a single subpage, and all the target sites also have their link exchange pages set up the same way on a subpage, will the host site gain in value?

A: I’ll tell you straight up from personal experience: yes, it does. It’s simple to test; if you don’t believe me, go for it yourself.

Now we’ll move up to a much larger scale with a specific on-topic example using sitewide links.

Q: If you own two 100k+ page lyric sites with lots of inbound links and very good indexing ratios, will putting a sitewide link to the other site on both raise both in value, or keep them both the same?

A: Also from my personal experience, yes: both will not only rise in value but skyrocket upwards of 50%, which can result in much higher rankings. Likewise, this example can be done with any niche and any two large sites. Cross-promote them with sitewide links between the two and see what happens. The results shouldn’t be surprising.

Now, on the large scale, to the meat of the question.

Q: If these two lyric sites cross-compared all their inbound links from other sites and managed to get every site that links to lyric site A to also link to lyric site B, to the point at which each increased in links by 100k (the same as the increase would have been with a sitewide link between the two), would both sites increase in value more than if they did the sitewide link instead?

A: Yes, absolutely. This is a bit harder to test, but if you’ve been building an SEO Empire and each site’s inbound links are from your own sites, then it becomes quite a bit easier to test, and I’m certain you’ll find the results the same as I did.

Conclusion
On a 1:1 ratio across a generalized population of relevant vs. non-relevant links, inbound links from separate domains/sites are still more effective than a sitewide link of the same magnitude. However! A sitewide link does benefit both sites to a very high degree; just not to the degree that lots of other sites can accomplish.

Sorry that question took so long to answer. I didn’t want to just give you a blank and blunt answer. I wanted to actually answer it with logic and reasoning that hopefully leads to an understanding of the ever-so-important WHY.

–>

Quick Answers #2 – The Word of The Day Is Class-Cunt IP

Now that I’ve put the dreaded C-word in the title, mine won’t be the only office in the nation calling it Class-Cunt IPs. Watch, you’ll catch yourself doing it, and frankly you deserve it. To make the transition into a technopotty mouth easier, here’s a handy mnemonic: A Big Cunt Drowns Easier (the E is in case we ever make that switch the government keeps rambling on about).

I probably get more questions about my distribution of IPs than any other type. Frankly, I can answer it in one word: evenly. But once again hitting up our Open Questions post, here’s a question that I think best illustrates the topic.

This one is from Quinton Figueroa

1. For each domain do you split your subdomains up in multiple C Class IPs or do they all stay on 1? Does it depend?

2. For each domain do you link from your subdomains to other subdomains or do you keep each one as its own stand alone “site”?

3. Do you set up in the 100’s of subdomains or in the 1,000’s of subdomains (or maybe more) per domain?

Appreciate the help man, you kick ass!

Google doesn’t penalize a site because of the other sites on the same IP or class. I say this with confidence because even though Matt Cutts publicly said it in one of his video dialogs, I still researched it myself to make damn sure (you can thank me later, ionhosting). I also haven’t seen any evidence that the other search engines are any different. So I give the same answer whether I’m talking about one site having a different IP than another or a subdomain having a different IP than the main domain. It’s all under the same point of reference. But to address the question directly: what’s the one primary reason why a subdomain has a different IP than the main domain? That’s right, it’s on a different server.

Side Track
BTW, when people make a statement like “I haven’t seen any evidence,” it usually means they haven’t LOOKED at any evidence. For future reference, give statements like that about as much authority as a one-legged security officer. Do your own research.

Back On Track
If there is no penalty for sites being on the same IP, and there is no explicit reward for being on separate IPs, then all that’s left is two small benefits: 1. If your sites are black hat, it makes them harder to track down. 2. The links between two sites appear more natural if they are on separate IPs (whether or not this is an actual benefit remains to be seen). So the whole IP diversification business boils down to costs vs. financial reward. While in the past I was very cautious about my own IP dispersal, which was only in part because during that period I was able to acquire IPs very cost-efficiently, I have since lessened my efforts. The rewards vs. the costs just aren’t there enough to invest any worry in the matter. So my answer is simply “evenly.” Use what you’ve got. If you get a server and it gives you 10 free IPs, use them all and just distribute your sites amongst them. You won’t regret it, and at the same time you wouldn’t see any explicit benefit from dumping a bunch of extra money every month into more IPs. The money is obviously better spent on things that make more revenue, such as domains and servers. Even if you had unlimited IPs, how would you end up distributing them? Evenly…
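“Distribute evenly” is literally just round-robin. A minimal sketch, with placeholder domains and the reserved 192.0.2.x documentation range standing in for your server’s 10 free IPs:

```python
# Sketch of even IP distribution: round-robin sites across whatever IP pool
# the server hands you. All domains and IPs are placeholders.
from itertools import cycle

def distribute(sites, ips):
    """Assign each site an IP, cycling through the pool evenly."""
    pool = cycle(ips)
    return {site: next(pool) for site in sites}

sites = [f"site{i}.example" for i in range(25)]
ips = [f"192.0.2.{i}" for i in range(10)]     # the "10 free IPs"
assignment = distribute(sites, ips)

# Tally how many sites landed on each IP; round-robin keeps this balanced.
counts = {}
for ip in assignment.values():
    counts[ip] = counts.get(ip, 0) + 1
```

With 25 sites over 10 IPs, no IP ends up carrying more than one site over any other, which is the whole point.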

To be perfectly clear, even though I take IP distribution with a grain of salt, it doesn’t mean I take nameserver distribution lightly, and the same applies to domain registration info. In fact, I’d say the one exception to the IP carefree rule is if you happen to write a blog teaching people how to bend Google over. I mention it because I know some of you do. In which case, be very careful about which sites you allow others to see. Throwing a few decoys out also doesn’t hurt, because “do no evil” policies don’t apply to profit risks. Paranoia? For a year and a half, yes; after Oct 21st of this year, no. You may not get it, but someone somewhere just shit their pants. So feel free to giggle anyways.

As for questions 2 and 3: if you had asked me a year ago I would have had a completely different response, yet the basic principle still remains. I talked about this topic in great depth in my SEO Empire Part 1 post. Reread the section where I talk about the One Way Street Theory. The decision on how many subdomains, as well as whether or not they should be orphan subdomains or innerlinked, is a decision I make by asking whether or not those subdomains would benefit the main domain. If they benefit it, then I establish a relationship between the two (i.e., a link, either one-way or exchanged). If they don’t, then I keep the subdomains orphaned. BTW, the term Orphan Subdomain or Orphan Subpage was coined by an obnoxious troll here. I kinda liked it so I kept it. It means the subdomain has no relationship with the main domain or any other pages or subdomains of the site. Watch out for innerlinking between subdomains, though. Think in terms of sites that do it effectively and sites that don’t. If you’re innerlinking in a way that mimics About.com or similar, then great. If you’re innerlinking in the way that, say, Blog Solution would, for the sake of link building to each subdomain, I’d advise against it for footprint reasons. And for god’s sake, if you’re hosting a black hat generated site on a white hat domain, don’t even consider it!

Do’s and Don’ts of Subdomains
Do create subdomains for the purpose of exploiting an established domain’s authority. - I’ve talked a lot about software-related sites. I think they’re a great and easy way to build domain authority. Anything related can be thrown into a subdomain. I’ve got a couple of general sites that have great domain authority, and anything I throw up on them does well in the SERPs almost instantly. I make sure not to overdo it, and it works out very well for me.

Don’t create subdomains to save on domain costs. - It’s less than ten dollars a year, for fuck’s sake. Don’t risk trashing a $20/day site and the authority that took you a year or two to establish to save $10/year.

–>

SEO Empire – Part 1

Podcast Versions:

Printer Friendly: Part 1

This is exactly how I make money online…

This blog has a lot of great tips and techniques to help the average webmaster break beyond their barriers. However, they are nothing more than skillsets, and skillsets are worthless without direction. For that reason, before I’m done with the missions I have for this hobby (blog), I want to lay down 4 cornerstone strategy posts. This is the second, behind my SERP Domination post, which taught the power behind numbers. As mentioned in my Log Link Matching article, every technique on this blog interconnects like a well-connected puzzle and fits together perfectly to form an ultimate SEO strategy. This is that strategy. In that spirit, every post before this one builds up to this post and every post after is a follow-up to it. By now you hopefully have had time to browse through the archives and digest all the past posts. This will give you the necessary skillset and, more importantly, mindset to put all this into practice. I’ve always preached that there are no rules in SEO, only loosely enforced guidelines. So it’s time to take the Jalape

–>

How To Dupe Content And Get Away With It

Let’s do one more post about content. First, consider Google’s Webmaster Blog post dispelling common duplicate content myths as prerequisite reading. Do I always trust what Google’s public relations tells me? Absolutely not, but it does confirm my own long-standing protests against people who perpetuated the paranoia about duplicate content, so it makes a good point of reference.

The most common myth fueling the paranoia is, “anytime I use content that is published somewhere else, I am sure to fall victim to duplicate content penalties.” This of course is bunk, because for any specific terms related to an article you can show me, I can find you 9 other sites that rank for its terms that aren’t necessarily supplemental and full of penalties. However, there is no doubt that there really is a duplicate content penalty, so we’ll discuss ways around it. One of my favorite parroted phrases is, “It’s not what content you use. It’s how you use it.” So we’ll start with formatting.

Here Is Some Spammy Text
Welcome to spam textville. This is just a bunch of spammy text. Text to fill and spam the streets. This is horrible spam text filled content that will surely get my spam site banned. Spam spam spam. It’s not food it’s text. Spammy text. I copied this spam text all over my site and others are copying it for their spammy text sites. I can’t believe I’m keyword stuffing for the words spammy text.

Alone in paragraph form, this text is very easy to detect as spam and autogenerated. So the classic SEO ideology of “well-written, article-style, paragraphed text does well” gets thrown out the window with this example. However, since I would love nothing more than to rank for the term “Spammy Text” and this is all the content available to me, I have to abandon the idea of keyword stuffing and find some new formats to put this text in that search engines will find acceptable.

How about an Ordered List?

  • Welcome to spam textville.
  • This is just a bunch of spammy text.
  • Text to fill and spam the streets.
  • Lists and bulleted points work very well because the text enclosed is meant to be very choppy, short, and full of repetition such as “My goals are,” “The plan is,” “Do this,” etc. If the common ordered list is formatted as such, then we by all rights can do the same.
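The list trick above is trivially automated. A minimal sketch, where the sentence splitting and HTML structure are my own assumptions about how you might implement it:

```python
# Sketch of the reformatting trick: split duplicated paragraph text into
# sentences and emit it as an HTML ordered list instead of a wall of text.
import html
import re

def to_ordered_list(text):
    """Turn one paragraph into an <ol>, one sentence per <li>."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    items = "".join(f"  <li>{html.escape(s)}</li>\n" for s in sentences)
    return f"<ol>\n{items}</ol>"

spam = "Welcome to spam textville. This is just a bunch of spammy text."
listing = to_ordered_list(spam)
```

Same words, different container; the choppiness that looked autogenerated in a paragraph looks standardized in a list.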

What about presenting it as user contributed?

Comments (3)

John Doe: Spam spam spam.

Jane Doe: I copied this spam text all over my site and others are copying it for their spammy text sites.

John Deer: Spammy text.

How many of you readers have left complete crap in my comments? I’m not banned or penalized yet. Faking user-contributed material works great because its outcome is unpredictable; therefore you can do virtually anything with it and get away with it, including but not limited to innerlinking.

Mary Jane: I saw this wicked article about this on Eli’s blog subdomainspam.spammydomain.com/spammysubpage.html check it out!

Break It Up Into Headings

Heading 1

Welcome to spam textville. This is just a bunch of spammy text.

Heading 2

Text to fill and spam the streets. Spam spam spam. It’s not food it’s text.

All the keywords are there; it’s just no longer spammy because it’s been broken up properly into nice little paragraphs. Once again, standardized = acceptable.

Change The Format
What about PDFs? They may not support contextual ads very well, but they most certainly can contain affiliate links. The engines also tend to be quite lenient on them and redundant text. For more information, read my Document Links post.

Let’s Move On
So now that we can format our text to avoid the penalties, what if we attempt to sidestep them altogether? I talked about how to swap titles out using a thesaurus and IMDB data in my Desert Scraping post, so I won’t talk too much about it, but definitely consider doing some research on exploiting LSI in this manner.

How about scraping the right content?
Heavily syndicated content works well for duping, and it has the added bonus of being exclusively high quality. For instance, I sometimes like to snag shit from the good ol’ AP. It’s not the smartest legal move, but seriously, who’s going to deeply investigate and do anything about it? In such an event it’s always an option to just remove the article upon receipt of the C&D letter.

All in all, there’s plenty you can do to dupe content and get away with it. It’s a pretty open game and there’s a lot out there.

Have Fun

–>

How To Build Your Own SQUIRT

LOL, be truthful. Did you honestly see this post coming? People wanted to know how the tool works, but I think I can do you all one better. I’ll explain in detail exactly how it works and how to build one for yourself, so you can have your very own; hell, one to sell if you wanted. Would you guess there’s a demand for one? Haha, sure, why not? I can’t think of a single good reason why I shouldn’t (I never considered money a good enough reason not to help people). However, I would like to ask a small favor of you first. Please go to RobsTool.com and subscribe. Throughout this month we are adding a new section called “The Lab” inside the tool, where we are going to be hosting a multitude of crazy and wacky SEO tools that you’ve probably never thought could exist. Even if you don’t have a membership, please subscribe anyway so you can get some cool ideas for tools to build yourself. That out of the way, let’s begin.

The Premise
SQUIRT works off of a very, very simple premise. Over the span of the months you promote your websites, from their infancy to well-aged mongers, you make dozens of small decisions daily. All the tool does is mimic as many of those decisions as possible and boil them down to a small yes or no; true or false; a boolean expression. This is just very basic AI (Artificial Intelligence): decision making based on some data. There really is nothing complex about it. Think about the first time you promoted a website in the search engines. You looked at what you considered to be some factors in ranking. You saw what your competitors had that you didn’t. What was your first initial reaction? You likely thought, “How can I get what they have?” A script can’t do this, of course. This is a human reaction that can’t be duplicated by a machine. However, now think about the second, fifth, tenth website you’ve promoted. Once again you looked at what your competitors had that you didn’t. From there you may have noticed your mindset changed from “How can I get it?” to something like, “How did I get it before?” This a machine can do! Experience just made the difference between a decision that needs to be made and a predefined decision that sets an orchestrated path. I know this all seems overwhelming, but I assure you it’s all really, really simple. The trick is to just build a tool that does whatever you would do, based on the current situation. The situation, of course, can be defined by what we all know and study every day of our professional SEM lives: search engine factors. So the best place to begin is there.

A List Of Factors
Since the tool will make its decisions based on what you consider to be factors search engines use to rank your sites, making a list of all the known factors is a big help. Sit down and write out every search engine factor you can think of. Break them down to specifics. Don’t just write “links.” Write link volume, link quality, links on unique domains, percentage of links with my anchor text, etc. The SQUIRT utility I released looks at 60 separate factors, so at least you have a general goal to shoot for. Come up with as many factors as possible. Once you’ve got a good clean list of factors, start figuring out a proper way to measure each of them.

Factor Measurement
How many times today did you go to a search engine and type in link:www.domain.com? That is a measure of a factor. How about site:www.domain.com? That’s another. Each of those is a factor that, when explored by either going to your own site or going to the search engines, can result in some figure or number you can use to calculate how your site fares in comparison to the sites currently ranking. Let’s use an example. You go to Google and search for the keywords you want to rank for. You make a list of all the sites in the top 10 and separately do a link: command for each of their domains. You then take all those figures and average them out. That gives you a rough idea of how much “link volume” you will need to get into the top 10. You then do a link: command on your own site to see how close your site is to that figure. From there you can make a decision: do I need to work on increasing my link volume factor or not? You just made a boolean decision based on a single factor using data available to you. It probably took you a good 5 minutes or more to make that decision, whereas a script could have made it for you in less than a second. Now, I know you’re all just as much of a computer nerd as I am, so I don’t have to preach to you about the differences in efficiency between yourself and a machine, but at least think about the time you would compound making these very simple decisions for each and every factor on your list, for each site you own. There goes a good five hours of your work day just making the predictable yes-or-no decisions on what needs to be done. This sounds ridiculous of course, but I’d be willing to bet that at least 90% of the people reading this post right now spend most of their time doing just that. Ever wonder why most search marketers just trudge along and can’t seem to get anywhere? Now you know.
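The link-volume measurement above boils down to a few lines. A sketch with made-up backlink counts; in practice the figures would come from link: queries or a backlink tool:

```python
# Sketch of a single-factor measurement: average the top-10 competitors'
# backlink counts and reduce the comparison to one boolean decision.
# All counts here are invented for illustration.

def needs_more_links(my_links, competitor_links):
    """True if our link volume is below the top-10 average."""
    target = sum(competitor_links) / len(competitor_links)
    return my_links < target

top10 = [1200, 800, 950, 700, 1100, 600, 900, 850, 750, 1000]  # avg 885
decision = needs_more_links(my_links=400, competitor_links=top10)
```

With 400 links against an 885 average, the script decides in microseconds what takes you 5 minutes by hand.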

Making The Decisions
Okay, so let’s take an example of a factor and have our script make a decision based on it. We’ll look at the anchor text saturation factor. We look at our inbound links and find all the ones that contain our anchor text versus the ones that don’t and only contain some similar words somewhere else on the page (most other documents). We then make a percentage. Say that after looking at it, 30% of our inbound links contain our exact anchor text. We then look at our competition; they seem to average 40%. Therefore our script needs to follow a promotional plan that increases our percentage of links that contain our exact anchor text. Very good; we’ll move along. Next we’ll look at inbound links that don’t contain our anchor text but contain our keywords somewhere in the document. Looking at our site, we seem to average about 70%. Our competition seems to average about 60%. So we are doing much better than our competition, and therefore our script doesn’t need to increase our links that don’t contain our exact anchor text but do have relevant text. Wait, did I just contradict myself? These two factors are complementary: the more our tool increases one factor, the further the other one drops. Wouldn’t this throw our promotion into some sort of infinite loop? Yes, I did contradict myself, and yes, it would put our promotion through an infinite loop. This is called ongoing promotion. The fact is, THEY rank and YOU don’t. Therefore you have to keep improving the factors you lack until you do rank, even if they seem to almost contradict each other. By the end of the analysis your script ends up with a long list of DOs and a long list of DON’T-NEED-AT-THIS-TIMEs. So now all you have to do is use your own experience and your own site network to make all the DOs happen to the best of its abilities.
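The whole decision pass is just that same boolean comparison looped over every factor, sorting each into the DO or DON’T-NEED list. A sketch using the example percentages above (factor names and the data layout are my own assumptions):

```python
# Sketch of the full decision pass: compare each measured factor against the
# competitors' average and sort it into a DO or DON'T-NEED-AT-THIS-TIME list.

def promotion_plan(mine, competitors_avg):
    """Return (do, dont_need) lists of factor names, one boolean per factor."""
    do, dont_need = [], []
    for factor, my_score in mine.items():
        (do if my_score < competitors_avg[factor] else dont_need).append(factor)
    return do, dont_need

# The anchor-text example from the text, plus the link-volume figure:
mine = {"anchor_text_pct": 30, "relevant_text_pct": 70, "link_volume": 400}
avg  = {"anchor_text_pct": 40, "relevant_text_pct": 60, "link_volume": 885}

todo, dont_need = promotion_plan(mine, avg)
```

Anchor text saturation and link volume land in the DO list; relevant-text links land in DON’T-NEED, exactly the pair of decisions walked through above.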

    Establishing A Promotional Plan

    So now that we have a list of all the stuff we need to improve with our site, we can program our SQUIRT script to simply do whatever it is we would do to compensate for our site's shortfalls. To give you a better idea of how to do this and how SQUIRT handles these exact situations, I'll take 3 example factors and let you know exactly what it does when you hit that submit button. Keep in mind, however, that no matter how much information you gather on each site, every promotional situation is unique and requires a certain amount of human touch. The only thing you can do is define what you would do in the situation if you had no prior knowledge of the site or any extenuating circumstances. Also keep in mind that you have to remain completely hands off. You don't have FTP access to their site and you can't mess with their design, so anything you do has to be completely offsite SEO. Also, anything you do can't hurt the site in any way. Every plan needs to be 100% focused on building and 100% positive; any method of promotion that may possibly cause problems for them is out, even if you plan on only running throwaway black hat sites through the tool. So if you want to go get links, you need to do it within your own network of sites. You can't go out sending spam comments or anything.

    Page Rank

    Your Site: PR 2
    AVG Competitor: PR 3
    Decision: Increase Page Rank

    The Plan: Create a network of very large sites, since PageRank can be gathered internally just as easily as from external sources. You need to build a network of sites with lots and lots of indexed pages. Take a look at the current volume of sites you plan on running through your SQUIRT tool and decide beforehand how big you need to build your network. When we decided to make SQUIRT public, we knew that even though not all the sites would require a PageRank increase, a TON would. We launched the tool with the capability of handling 500 members. So we knew that 500 members, submitting 10 sites/day with each link needing to hold on a single page for at least a week, could result in needing 150,000 links available to us each week. If each link was on a page with a PR 1, then each page would send a tiny PageRank increase to the target link. Likewise, if each indexed page had a PR 1 and we put five links up on each page, then each page would give out even more PageRank through the links. There is a point of saturation of course; we decided 10 links was good for each page. That way we could get the maximum amount of PageRank sucked out of each indexed page while maintaining the highest possible volume of links we could spare. So if we built a network of sites that contained a total of about 1,000,000 pages indexed (you heard me), each averaging a PR 1, then we could transfer enough PageRank to 150,000 links/week to constitute a possible bump in PageRank for each link. For your own personal SQUIRT you of course don't need nearly that volume. However, make sure to plan ahead, because even if you make a 1 million page site, it doesn't mean you will get 1 million pages in the index. You may have to build quite a few very large sites to reach your goals. This of course takes time and lots of resources.
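The capacity math above, spelled out as a back-of-the-envelope sketch using the post's own figures (this is arithmetic for illustration, not the actual SQUIRT code):

```python
# Back-of-the-envelope capacity math from the figures above.
links_needed_per_week = 150_000   # the post's working figure for 500 members
links_per_page = 10               # chosen saturation point per indexed page

# Indexed PR 1 pages that must carry live links at any one time:
pages_carrying_links = links_needed_per_week // links_per_page
print(pages_carrying_links)  # 15000

# With ~1,000,000 pages indexed, only a small fraction of the network
# carries links at once, leaving plenty of headroom for rotation.
network_pages = 1_000_000
print(pages_carrying_links / network_pages)  # 0.015
```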

    Anchor Text

    Your Site: 30%
    AVG Competitor: 40%
    Decision: We need to increase the number of links that contain our exact anchor text.

    The Plan: Since the tool can't directly affect the site, we can't exactly go to every inbound link the site has gotten and figure out a way to get them to change the anchor text. There are just too many ways to gain links, and it's completely unreasonable to attempt. So the only way to increase the anchor text match percentage is to increase the total number of links to the site and have them all match the anchor text. This is where you need to queue up the blogs. All you have to do is create a bunch of blogs all over and take steps to increase the chances of each individual post getting indexed. Then you can fill the blogs with the anchor text of the site. This however is no easy task when dealing with a large volume. Since you need plenty of authority, you will need to get accounts on all the major blog networks. Remember, the average blog usually has fewer than 5 posts/day, so you will need to compensate in sheer volume of actual accounts. These will also need to be maintained, and anytime one gets banned another needs to be automatically created to take its place. Those of you using the tool have probably already noticed links from these places showing up in your logs and Technorati links. Since we knew so many links/day would be required, we had to create an absolutely HUGE network of blogs in various places, as well as the automated system to create new ones if another gets buried. Once these links have been dispersed over time, the anchor text percentage will start to rise.

    Deep Indexing

    Your site: 3 pages in the index.
    Crawl Stats: 10+ subpages identified
    Decision: Need to get more bots to the subpages of the submitted site.

    The Plan: The script grabs the main page of the site and immediately sees more subpages available than are currently indexed in the search engines. This lets us know that the submitted site is having a hard time being properly deep indexed. So this is when we queue the Roll Over Sites to help get some SE bots to these subpages. There is one problem however. When dealing with my own sites it's fine to scrape the content then redirect, as detailed in the strategy I talked about in the Power Indexing post. However, since this tool will be a public one, I can't scrape people's content, because there is the odd chance that my rollover site may temporarily outrank their actual site and it could draw some anger from people who don't know what it is. Remember the rule: the tool can't harm the site in any way. So we had to go another route. Instead we pulled from a huge list of articles and used that as the content, then pushed the spiders to those pages and, once they got indexed, redirected them to the site. These of course don't show up as links, so it's all backend, but it does a fantastic job of getting subpages indexed. Since Rollover Sites use actual content, there is no problem making them absolutely huge. Therefore you don't need a very large network of them to push a very large amount of bots. Yet at the same time you still have to follow the rule of no interference with their site. So if their site is having a hard time getting deep indexed and you can't exactly FTP a sitemap over, then you have to bring in a large network of Third Party Rolling Sitemaps. What you do is create a network of sites that are essentially nothing more than generic sitemaps. You drive a lot of bot traffic to them on a regular basis and have them roll through pages that go through the tool.
    Once the tool has identified up to 10 subpages of the target site, it can add them to the network of third party sitemaps. The new pages go in, old pages go out (the technical term for this is a FIFO queue). If you have a solid enough network of third party sitemaps, you can hopefully push enough search engine crawlers through them to get every page crawled before it gets pushed out. This of course is a huge problem when dealing with a large volume of sites. If we originally wanted enough power for 500 members, then we knew that 500 members submitting 10 sites/day, each containing up to 10 subpages, would mean we would need enough third party sitemaps to accommodate 50,000 pages/day. While the efficiency of a single third party sitemap site may be high, a massive network would be needed to push that kind of volume. It's just beyond unreasonable. So instead, we incorporated a gapping system. Anytime there wasn't a new page coming in and a set of links would otherwise be displayed to a SE bot for the second time, it would grab some older entries that were already rolled through and display them as well. So if you push enough crawl traffic through, then eventually every page will theoretically get crawled.
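Here's a toy sketch of a rolling FIFO sitemap with the gapping system described above. The class and method names are my own, and a real implementation would render actual HTML sitemap pages rather than return a list.

```python
from collections import deque

class RollingSitemap:
    """Toy third-party rolling sitemap: new URLs display first (FIFO),
    and when no new pages are coming in, older already-rolled entries
    are recycled into the gaps so they get crawled again."""

    def __init__(self, page_size=10):
        self.page_size = page_size
        self.fresh = deque()   # subpages not yet shown to a crawler
        self.rolled = deque()  # subpages already rolled through once

    def add(self, url):
        self.fresh.append(url)

    def render(self):
        """Build the list of links shown to a visiting SE bot."""
        page = []
        while self.fresh and len(page) < self.page_size:
            url = self.fresh.popleft()
            page.append(url)
            self.rolled.append(url)
        # Gapping system: fill leftover slots with the oldest rolled URLs,
        # rotating them to the back so everything eventually re-displays.
        for _ in range(min(len(self.rolled), self.page_size - len(page))):
            url = self.rolled.popleft()
            page.append(url)
            self.rolled.append(url)
        return page

sm = RollingSitemap(page_size=3)
for u in ["/a", "/b", "/c", "/d"]:
    sm.add(u)
print(sm.render())  # ['/a', '/b', '/c']  fresh pages roll through first
print(sm.render())  # ['/d', '/a', '/b']  gaps filled with recycled entries
```

The second render shows the key property: with only one fresh page queued, the bot still sees a full page because old entries rotate back in.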

    Rinse and Repeat

    So that's all there is to it. It's really quite simple. As you build your SQUIRT, work your way through every factor you wrote down and think about what you would do as a hands-off Internet Marketer to compensate if a site didn't meet the requirements for that particular factor. This also applies to flaws in onsite SEO. Let's say for instance the page doesn't have the keywords in the title tag. You can't touch that title tag, so what's the offsite SEO equivalent to not having the keywords in the title tag? Putting them in the anchor text of course. Just keep relentlessly plugging away and build an established plan for each factor on your list. With each failure to meet a factor there is potentially a problem. With each problem there is a potential solution your script and personal site network can take care of for you. It just may require a bit of thinking and a whole lot of building. It's tough, trust me I know, but it's worth it, because in the end you end up with a tool that comes close to automatically solving most of the daily problems that plague you. Here's the exciting part: once you start building and going through all the factors, you'll find some you probably can't solve. Just remember, every problem has a solution. So you may just learn a few tricks and secrets of your own along the way. Shhh, don't share them.

    Is SQUIRT Biased?

    ABSOLUTELY! Think about what it uses to determine the site's standing on its strengths and weaknesses. If you do a link: command, there is a delay between what the search engine shows and what actually exists that may range up to days and weeks. This causes major problems. The tool may think you need a bunch of links when in fact you already have plenty and they just haven't shown up yet. It may think you have a ton of links when those links were actually only temporary, or you lost them and they just haven't disappeared from the index yet. Essentially the tool operates off of your site's potential SEO worth. So let's say you have an old site that you haven't really done anything with, and you run it through your SQUIRT. The tool will stand a much better chance of making an accurate analysis, and likewise any boosts you receive in the factors will more than likely be the right ones. Therefore it will appear as if your site skyrocketed up simply because of one little submit through the tool, when in fact the site had that sort of potential energy the whole time and just needed a little shove to get moving. The same could be said about an established site experiencing extremely slow growth. It fares well, maybe even in the 60%+ range, so it appears to not need very much. At the same time, everything the tool does do matters very little in the larger scheme of your escalated promotional campaign. Also, the tool can only focus on one site during one phase of its promotion at a time. So if you submit a brand new site, the tool will naturally focus more heavily on getting the site properly indexed and less on helping it gain rank through the SERPs. So if the indexing methodologies don't completely work in that particular case, it appears as if your tool didn't do anything at all. Remember, this is only a tool; it's all artificial decision making. There is no substitute for the human touch.
    No matter how complex or how efficient a tool you build is, there is no way it can compete without an actual human element helping to push it. All your competitors are humans, so there's no logical reason to expect you can ever build a tool that is the end-all solution to beating them every time. So even though you now have a really cool tool, still remember to be hardworking, smart, and efficient…never lazy.

    Sorry about the long-as-hell post, but you guys wanted to know what the tool was doing when you clicked the button. Now you do, and hey…at least I didn't go through ALL 60 factors.

    Get Building


    Real SEO Example

    Every so often I get an email from someone who, instead of having a question or comment, gives me a URL and asks me to give them SEO advice on it. I normally don't respond to these emails other than maybe a quick "did you have a specific question" type response. It's not because I don't want to or take any offense to it; in fact it's the complete opposite. Sorry to say, I just don't have the time to do full blown SEO analysis jobs for people. It's not that I don't want to, trust me I do, it's just that I'm not one of those SEO bloggers whose career is blogging about SEO. I'm in the thick of it just like you guys are, day in and day out. I'm out there doing the techniques I talk about on this site every day. That's my job, and there's no shortage of it. However, every once in awhile I'll hit a gem: one of those unique situations that I can't help but mull over in my mind. One such example came to me last week from a reader here who was referred by Jon Waraas (excellent blog; check it out), and I think it could apply to a lot of other people in the same situation. He was kind and patient enough to allow me to publicize my response in an effort to help out others. So out of the usual context I'm going to take this example and show you all exactly what I would do in his situation.

    The Situation

    His site is going for the term "Myspace Layouts." He used to rank #1 for a solid amount of time. However, after some time he admittedly got lazy and lost his ranking. He now still ranks steadily in the top 10, but not quite top 5. He wants to rank number one again and was wondering if implementing my Keyword Fluffing and SERP Domination strategies would help.

    I consider "Myspace Layouts" a highly competitive term, so this will be a great example. It also has the added difficulty factor of being an extremely fast growing niche, especially in just the last year. However, our mission isn't just to take the number one rank, but to keep it and make it intimidating for the others thinking about trying to take it themselves. So first we'll look at what we got, then we'll analyze what we're up against and see if we can spot any weaknesses we can use to our advantage.

    What We Got

    Without even having to look at his site, I can assume his on-site optimization is near perfect. He used to rank #1 steadily, and where he's at now he's holding firm. Obviously, if he ranked at one point there's no reason why he can't rank again. So my first suggestion would be to not do a damn thing to the site itself. In my SERP Domination post I talked about how to break into the top 10 of a highly competitive niche by splitting up the site into a network of smaller content sites. In this situation he's already in and has everything he needs to rank in the top position, so I would definitely advise against breaking up the site or making any drastic on-site SEO changes. Instead we'll focus on off-site optimization and getting what links we need to earn that coveted spot again. Right now he has about 320,000 inbound links according to the Yahoo linkdomain: command. About 303,000 go directly to the main page. ~86,000 of his links come directly from Myspace Profiles. He also has a PR 6.

    What They Got

    His top competitor has about 551,000 inbound links. 344,000 come directly from Myspace Profiles. Taking a quick look at the second and third placed competitors, they have slightly fewer links than the number one and there's nothing too notable about them. So for now we're going to say fuck 'em and not concern ourselves with what they're doing, because we know if we take out the #1 site we'll beat them as well. That, after all, is our main target, and without extenuating circumstances they can be ignored in this case. So with that information out of the way, let's look at some strong determining factors and some weaknesses of both sites in respect to their rankings.

    Spotting The Weaknesses and Loopholes

    Let's analyze the math real quick. Without Myspace profile links he has 234,000 links on his own from other sites. His competitor has about 207,000. He is the clear winner in this instance, which gives us a huge advantage. However, his 86,000 links directly from people's Myspace profiles account for only about 27% of his inbound links, while his competitor clearly dominates by having his 344,000 Myspace profile links account for a whopping 62% of his total inbound links. We now have both a strength and a weakness for each of the sites. More importantly, we now know why we're losing. Thus, we know what loopholes we'll need to exploit in order to win.
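The arithmetic above, spelled out. The figures are the Yahoo linkdomain: counts quoted in this post:

```python
# Link profile comparison using the figures quoted above.
ours_total, ours_myspace = 320_000, 86_000
comp_total, comp_myspace = 551_000, 344_000

# Links from regular sites (everything that isn't a Myspace profile):
print(ours_total - ours_myspace)   # 234000 -- we win here
print(comp_total - comp_myspace)   # 207000

# Share of inbound links that come from Myspace profiles:
print(round(ours_myspace / ours_total * 100))  # 27 -- our weakness
print(round(comp_myspace / comp_total * 100))  # 62 -- his strength
```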

    What Do We Know?

    By looking at both our site and the competition, we know that if we can increase our links from people's Myspace profiles in the index to at least 62%, we will win. That means we'll have to gain about 112,000 links on Myspace Profiles. We also know that our competitor is getting these profile links the same way we are: by giving out free Myspace layouts that include a link to our sites. There is a big problem standing in our way though. Since they are ranked #1 and we're ranked lower, we can assume they are getting more traffic than we are. Therefore, they are gaining these profile links faster than we are by giving out more layouts. It's quite the pickle. We'll have to do one of two things: either increase the value of each profile link or increase our volume at a much, much faster rate. Neither sounds like a fun solution, so we're going to have to take a few shortcuts. One of them includes a twist on a technique I've already talked about; the other is an upcoming Blue Hat Technique I have yet to discuss. First and foremost we need the proper mindset. We can't just try to beat this guy; we need to brutally destroy him and get him so far below us that he'll be stuck in the same situation we are in now. It's the only way to maintain our competitive edge. So get any thoughts of playing nice out of your head right now. They aren't going to do you any good.

    Establishing A Plan Of Attack

    First we'll need to do an interesting twist on my Link Laundering technique and merge it with our SERP Domination tactic. Remember when I mentioned in one of the comments that all the techniques I've talked about on this site fit together perfectly like a puzzle to create an ultimate ranking strategy? I wasn't fucking around. There's no reason why you can't create a network of sites within subniches that launder targeted links to your main site. He already mentioned in his original email that he wanted to create a few sites based on Myspace subniches to help build link popularity to his main site. Except, knowing what we know now, he doesn't need more links from outside sites; he needs more links from Myspace Profiles. So I'd recommend doing just that. Build several sites on Myspace subniches such as image uploaders, profile editors, web proxies, etc. However, anytime they have an opportunity to get a link on the person's profile, put in the URL to your main site instead. Looking at the current scenario he's going to need quite a few of these sites in order to catch up; I'd say about 15 that perform at the average or even a little below average. It also wouldn't hurt to sponsor a few of these types of sites in exchange for them laundering out links to you. I know places like Myspace web proxies have a hard time finding and keeping converting ads on their sites, so they could be a very cheap solution. As long as they are consistently getting your main site's links onto people's profiles, that's our ultimate goal.

    You've got to understand, this guy only has 344,000 links on profiles. That's not a big deal. If his site has been up for the last two years, that's only about 15,000/month, not counting a sharp increase from when he started ranking of course. Getting 15 subniche sites within your network each getting about 1,000 uses/month isn't too incredibly hard. You can get that within a month or two. Once you accomplish that, you are at least matching him, which is progress in the right direction.

    Matching him? I'm sorry, I fucked up. At that rate you're nowhere NEAR matching him. There is the inevitable law of diminishing returns standing in your way. There are over 100 million Myspace profiles, and only about 10 million exist in Google right now. Therefore only about 10% actually get indexed. Since it's only the ones that get indexed that matter to your rank, that means even if you match the 15,000/month, you're really only getting 1,500 that matter. That's not nearly enough. Consider your ass kicked. Or is it? We may need the help of a good ol' Blue Hat Technique to bridge the gap and turn that 10% into nearly 100%. The technique is called Log Link Matching. I'm not going to go into intricate detail about it until the post comes out of course, but I will hint at it and explain as it applies to this case.
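The indexing discount above, as simple arithmetic. The profile counts are the rough figures quoted in the post:

```python
# Only indexed profiles pass any value, so raw link volume gets
# discounted by the index rate.
profiles_total = 100_000_000
profiles_in_google = 10_000_000
index_rate = profiles_in_google / profiles_total  # ~10%

raw_profile_links = 15_000                 # profile links gained in a period
counted_links = raw_profile_links * index_rate
print(int(counted_links))  # 1500 -- the only ones the engines ever see
```

Log Link Matching is all about pushing that index_rate toward 1.0 instead of grinding out ten times the raw volume.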

    Blue Hat Technique #XX – Log Link Matching

    Anyone with experience can attest that not nearly all of the true links to your sites actually get indexed and counted in the search engines. It's unfortunate but true. Not every page is indexed by engines such as Google, and therefore even if those pages link to you, they can't pass any value. This is especially true for social sites like Myspace, where only about 10% of the profiles get indexed. Log Link Matching is a technique used to ensure that nearly 100% of your real inbound links actually get indexed by the search engines, thus dramatically increasing your visible inbound links and bumping your rank in an indescribable way. Here's how it works.

    First create an automated Roll Over Site using my Abandoned WordPress Accounts Part 2 post. Did I mention all my techniques merge together? Well, they do. Do this by first building several WordPress blogs into high PR blogs using the Digg.com method as described. This will get plenty of search engine spiders to the site on a steady and fast stream, as well as give plenty of authority to the site (to boost each site). Lastly, make a system that allows you to quickly add links to your blogrolls within these accounts (quit being lazy and learn how to code!).

    After you've got your WordPress.com blogs set up, start parsing through your site's log files. The log files are the first indicator of someone linking to your site. Every time someone pulls an image on their Myspace account from your server, it will show up in there, regardless of whether or not their profile actually shows up in the engines. Where there is a profile pulling your images, there is more than likely a link to your site. So you must get that profile page indexed in the search engines in order for the link to count. Parse through your log file for any referrers coming from profile.myspace.com or www.myspace.com/*. Remember, profile.myspace.com/viewprofile.blahblahblah?friendid=blahblah is a page on its own just like www.myspace.com/myusername, therefore BOTH count. So there's no reason why our site can't almost instantly double all of his links from Myspace profiles at the exact same time as getting nearly 100% of them indexed. All he has to do is start adding each of these referrers to his blogrolls. After so many, WordPress.com will automatically limit how many links actually show up and start randomizing which ones are shown on each and every pageview (or spider view). I think that limit is currently 15, but I'm not entirely sure; it seems to change every so often. Either way, every time Googlebot visits the rollover sites it'll be greeted by XX amount of Myspace profiles to index. Eventually it'll get almost all of them. Thus the illusion of our inbound links tripling and even quadrupling daily will start happening. We won't actually be performing nearly to the degree it appears, but we definitely won't be short any needed momentum either. It'll just appear that way, because many links we've had for a long time will finally start showing up and giving us proper credit.
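A hypothetical sketch of the log-parsing step. It assumes combined-format access logs where the referrer is a quoted field; the regex and sample line are my assumptions, not the author's actual script.

```python
import re

# Matches Myspace profile URLs in a referrer field. Both the
# profile.myspace.com and www.myspace.com/username forms count as pages.
MYSPACE_REF = re.compile(r'https?://(?:profile|www)\.myspace\.com/[^\s"]+')

def myspace_referrers(log_lines):
    """Collect every unique Myspace profile URL seen as a referrer.
    Each one is a profile that's probably linking to us and should be
    fed into a blogroll so it gets indexed."""
    found = set()
    for line in log_lines:
        found.update(MYSPACE_REF.findall(line))
    return sorted(found)

sample = [
    '1.2.3.4 - - [10/Oct/2007:13:55:36 -0700] "GET /img/bg.gif HTTP/1.0" '
    '200 2326 "http://www.myspace.com/someuser" "Mozilla/4.0"',
    '5.6.7.8 - - [10/Oct/2007:14:01:02 -0700] "GET /img/bg.gif HTTP/1.0" '
    '200 2326 "-" "Mozilla/4.0"',
]
print(myspace_referrers(sample))  # ['http://www.myspace.com/someuser']
```

From there each discovered URL would be pushed into the blogroll system described above.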

    If we keep parsing the log files, checking for new people using our templates, and ensuring they all get indexed, eventually we will catch up to that #1 site. Combine that with our link laundering sites within our network and we should have no problem overtaking him and holding our ground. Once we are at that point, it's check and mate for the time being. He's either going to have to top that performance or back off and accept his second place trophy. If he does manage to pull something sneaky and come back, no big deal; persistence is worth more than gold.

    I hope this little real life example helped a few people in the same situation, and I'll move up the date on that Blue Hat Log Link Matching Technique so you can get the details of it and really learn how to utilize it in some very powerful ways. It never hurts to have more in our arsenal. Just remember the fact that you already have more inbound links from other Myspace layout related sites than him. If you can just match him on his strong points, you can beat him.

    Go Get ‘Em Tiger!


    The Parable Of The White Hat & The Black Hat

    When linking here, for some reason people keep describing this blog's content as grey and black hat. I'm flattered, but I actually couldn't disagree more. If anything I'm trying to make some serious noise at the white hats and encourage them to learn some new tricks, but most seem to be covering their ears with enthusiasm. La la la la, I can't hear you! On that note, there's an interesting long-term debate going on disputing what's better, black hat or white hat? Did I say interesting? I meant boring and complete bullshit. Saying you like black hat better doesn't make you a black hat. Likewise, showing your disgust for black hat without learning it doesn't even make you a white hat. In my opinion, until you take the time to learn and develop both skill sets to the fullest extent, there's only one classification for you…Amateur.

    So with that I'm going to attempt to make the ultimate white hat post by telling the ever famous parable of The White Hatter and The Black Hatter. You may have already heard it. If you're not familiar, a parable is a short story, often fictional, told to illustrate a lesson or moral. If that still doesn't ring any bells, then you should read the Bible more, you heathen. There are several in there. I'm kidding of course; I respect your religion no matter what it is. And yes, there will be real techniques you can use hidden within the story.

    The Black Hatter And The White Hatter – When Two Pros Meet

    Once there was a white hatter. He had a great website that ranked #1 in a competitive niche. He had many fans of his site, and it got lots of search engine traffic from his primary keyword. Then one week, while checking up on his site, he noticed another high quality site in his niche moving up quickly in the ranks. It was also a very nice site with lots of value. The White Hatter didn't think much of it because of his own solid rankings, but he knew he had better watch this site more closely due to its upward momentum. Then one day, after a small SERP update, the site he was watching moved into the #2 slot right under him. This started to make him nervous, because he knew the difference in income between the number one and number two slots. He had gotten very comfortable in the number one position and had no intentions of giving it up.

    Once there was a black hatter. He created a few Made For Adsense sites across a couple hundred niches. They were fairly uniform. Some performed well, some performed poorly. By chance of fate, one of these sites was in the White Hatter's niche. Since this was a competitive niche, his site quickly caught the attention of the lower positioned sites that were still struggling amongst themselves for the #2 slot. The black hatter's site was doing fairly well in this niche and making a couple bucks a day. It wasn't ranking for any major terms within the niche, but since the niche was such a good one it was still bringing him some solid residual income, and he was very happy. The rest of the site owners within the niche, angered by his intrusion, quickly took action against his black hat site. After a few legal threats and a bunch of complaints to everyone possible, they were finally successful and got the black hatter's site taken down and banned. Having entered the niche and realized its potential, the Black Hatter reluctantly took the site down and built a clean site to compete within the niche. There's no point in letting such a great niche go, he thought. So he built a very clean and high quality site and aimed it directly at the most competitive keyword. He built up the site nicely and quickly. While the other lower sites within the niche fought amongst themselves and spent their time combating the endless supply of spam entering their niche, he focused on building the link count required for the number one position. It wasn't very long before he managed to grab the number 2 slot. Things were going well, except that the number one site in his way was clearly going to be a force to be reckoned with. He was going to have to pull off something slick.

    By this time the site had the full attention of the White Hatter. He started watching the inbound links and site content of the Black Hatter's site intently. It seemed fairly even. In fact, this site had even managed to get many of the same link spots he had, as well as a few new ones. The site had lost its momentum but was still a worthy adversary for the top spot. The White Hatter watched in dismay as his site and the other bounced back and forth between the top two positions. He continued building links and working on his site. Then, without notice, the other site took the top position and it stuck. The opponent kept it and wasn't budging. The White Hatter had to figure out why, and quickly. Both sites had a solid number of links at about 45k-50k, and almost all were at least relevant. He started investigating all the inbound links and finally one day found something very odd. This other site had about 10,000 new inbound links from random Blogspot accounts.

    The Black Hatter knew he wasn't going to take down this monstrous number one site by playing clean. He had to do something drastic. So he whipped up some scripts, grabbed a list of the top 1,000 or so keywords for the niche, and created some Blogspot accounts accordingly. He populated them from a popular RSS aggregator and made sure each page had a link to his main money site. With the added link popularity he easily took the number one position and wasn't budging. He didn't want these splogs to get quickly banned, and he knew later on he would need their link age, so he did the responsible black hat thing: he gave each post credit to the original and left all the ads out. He also knew not to continuously create too many and draw attention. He had to keep his numbers just high enough to gain and maintain his position, stop there, and just consistently update each blog once it was established. After all, the big money was in his big clean money site, and he knew it.

    The White Hatter definitely received a big blow to his business and had no intentions of taking it lying down. He created a crawler and, using the footprints within the Blogspot templates, started compiling a list of all the accounts made. He located a large portion of the 10,000 accounts and started investigating where all their content was coming from. They were clearly feed scrapes. So he took all the post titles and scanned common RSS aggregators. He found it! They were coming from Google Blog Search. The White Hatter was smart. He knew that if he started panicking, throwing a fit, and trying to get Blogspot and the search engines to delete and ban all these accounts, it would do him no good. The other site could easily generate them faster than he could ever get them deleted. It was clearly a futile effort and he knew it. Time for a workable plan. He had to hit the Blogspot accounts where it hurts. He used his crawler to scrape the titles of almost all the scraped posts. The White Hatter then concocted a script to ping Google Blog Search with the same post titles as were in his list. He injected links to his site within the article content as well as put himself as the source. Unfortunately, the scraper was smart enough to remove all the HTML from the feeds, but he still got to keep the original link within each and every post. He knew that if each of these post titles managed to get grabbed by the Black Hatter’s scraping script, then they would surely get snagged again once all the blogs got updated and he fed them into the aggregator. It worked. All the new posts on all the spammy Blogspot accounts now had a link to his site. This evened the playing field. Whenever the Black Hatter’s site got a link, his site got a link. However, this wasn’t acceptable; the black hat site still had all the previous post links and was barely beating him. Something had to get done about that. All these links had to be devalued.
So the White Hatter opened up his Akismet filter logs and started scanning for domains that were banned in the engines. Once he managed to find a couple hundred, he started slowly feeding them to the Blogspot accounts through the Google Blog Search feed. He kept the same post titles, knowing they would get scraped yet again, and he made a balanced mixture of putting his site as the source of the post as well as the banned domains, that way they would both get links.
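The counter-move described above can be sketched in a few lines. This is a minimal illustration, not anyone's actual script: it rebuilds an RSS feed from the scraped post titles and rotates the item links between the White Hatter's own site and the banned domains, so the scraper re-snags both. The domain names are made up for the example.

```python
# Illustrative sketch: reuse the scraped post titles in a fresh RSS feed,
# rotating each item's <link> through our list of source domains so the
# scraper picks up links to all of them.
from itertools import cycle
from xml.sax.saxutils import escape


def build_feed(titles, sources):
    """Return RSS 2.0 XML whose items reuse the given titles but point
    their links at a rotation of the given source domains."""
    rotation = cycle(sources)
    items = []
    for title in titles:
        link = next(rotation)
        items.append(
            "<item><title>%s</title><link>%s</link>"
            "<description>%s</description></item>"
            % (escape(title), escape(link), escape(title))
        )
    return (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>reseeded posts</title>"
        "<link>http://example.com</link>"
        "<description>same titles, our links</description>"
        "%s</channel></rss>" % "".join(items)
    )


feed = build_feed(
    ["Widget Review Roundup", "Top 10 Widgets"],
    ["http://whitehat-site.example", "http://banned-pharma.example"],
)
```

Pinging the aggregator with this feed would then be a single HTTP request; the point is that the titles match what the scraper is already hunting for, so the new links ride along for free.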

    By this time the Black Hatter, with his position secure, had already moved on to his next niche. Suddenly one day he noticed a large drop in revenue. His site had lost the number one position and was back in the battle for the first two spots. “What the hell is going on?!” he thought. He looked at his Blogspot scripts to find the source of the problem. He quickly noticed that not only were his accounts giving his main competitor links, but they weren’t worth a shit because they were also linking to a bunch of banned pharmaceutical sites, putting every splog he created into bad neighborhoods. His efforts were worthless and he knew why, and more importantly, how. It was time to face a tough decision. He could either endlessly combat this guy, who obviously knew his stuff, to keep his current revenue, or he could move on and continue to focus on new niches and creating new revenue for himself. He couldn’t help but laugh about it. So he emailed the White Hat and expressed his respect for the competitive exchange. There was no point in furthering it, so they both decided to call it quits and exchange links on the main page to help lock in their positions and let the algorithms decide from there who was better. Considering it a draw, they both went their separate ways.

    The Moral Of The Story

    Interpret it however you want, but recognize the fact that the White Hat defended himself. He didn’t just roll over or waste his time throwing a fit trying to get the inevitably endless supply of black hat sites banned or deleted. He knew his opponent’s tricks and thus was well equipped to combat them. He stood his moral stance and did what he had to in order to protect his business. No matter where you stand on the issue, you can respect that.

    If you hear anyone whining and crying about black hats or white hats, please politely explain to them that there are no black or white hats. Only amateurs, experts, and people who are willing to learn how to protect and grow their businesses. Then send them this story.
    Keyword Real Estate

    Hey guys and gals. In case you didn’t notice, I took a short break from posting to catch up on some work. Thanks for hanging out. Let’s review. In my SERP Domination post I spilled the beans about creating a site network by breaking down a larger site to take down the competition in a very efficient manner. This of course works beautifully in reverse, but we’ll save that for another day. In the meantime, though, I want to touch on a small portion of that post I could have easily divulged a hell of a lot more on. I’m talking about the “secondary network.” In case you have short term memory loss, I’ll refresh ya with a quote from the post.

    [quote]Blah blah blah, sexist joke. Blah blah snarky remark. I’ll create a much larger secondary network to help boost my blah blah inbound link authority. Blah blah ramble ramble.[/quote]

    In words longer than short, I introduced the well proven concept of Keyword Real Estate. Keyword Real Estate isn’t just a concept, it’s a practice. In fact, scratch that. Keyword Real Estate is the law of high rankings, just as sure as Murphy’s law will make your servers go down between 1-3am instead of 9-5pm. Practicing good keyword real estate snatching will help boost your sites. I will go as far as to say that it is unwise to enter any niche, no matter how uncompetitive, without grabbing up as much real estate related to your terms as possible. I’ll explain.

    Domains, free hosting, blog networks, social networks, social bookmarking, URL shorteners. The list goes on and on. Anywhere there is an authoritative domain that allows you to own a static page, there is no reason why you shouldn’t register it and put up a landing page advertising your site(s), or even a simple link if that’s all that is possible. I’ll explain this in a simplistic white hat way using an analogy, because there is nothing gray or black hat about it; it’s just common sense business.

    Imagine you live in a big city and you set up a small business. The local phone book directories want you to pay them to put a small business card sized ad in the yellowpages/category/keyword/page/adspot. Okay, so it fits within your ROI, so you do it. It’s almost mandatory for a small business to be found by its customers. However, at the same time, all over the city there are these giant skyscrapers with all these blank billboards on them. They are giving them out for free; all you got to do is claim them before your competitors do, and you can put up any ad for your business you want within reason (TOS). Also, in other portions of this fictional city are little chunks of land that you can grab and put up a giant sign with a big ass arrow pointing to your business. They may not be in the primo business districts, but they at least get drive-by traffic. I don’t know about you, but I’m too ethical of a business owner to take advantage.

    Ya know on second thought, I think I will jump in on this offer. So let’s look at a few examples of these billboards and opportunities for Keyword Real Estate and where to easily find them. They are everywhere.

    A few examples might include:

    keyword.wordpress.com
    keyword.blogspot.com
    keyword.blogger.com
    del.icio.us/keyword
    borntobuzz.com/keyword
    digg.com/users/keyword
    technorati.com/profile/keyword
    bloglines.com/blog/keyword
    keyword.typepad.com
    astore.amazon.com/keyword-20
    myspace.com/keyword
    squidoo.com/keyword
    keyword.spaces.live.com
    someforum.com/members/keyword.html <- vBulletin with vBSEO installed & your link in the signature.

    I know it’s simple and not very advanced advice, but it’s solid. Look at what the experts are doing. A great example of how well this works is of course SEO contests. How are they won? In the simple sense, they are won by Keyword Real Estate. They are also a great resource for finding some awesome opportunities. After all, it works damn well. I personally never enter a competitive niche without grabbing as much real estate as possible on that keyword or phrase and using it to boost my sites’ rankings and traffic. It’s common sense; all you got to do is follow through religiously.
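If you're doing this for more than a couple of keywords, it's worth templating the list. Here's a minimal sketch that expands one keyword into candidate URLs using a handful of the patterns above; the pattern set is just a sample, and it doesn't check whether the name is actually still available.

```python
# Illustrative sketch: expand a keyword into candidate "real estate"
# URLs on authoritative domains. Patterns mirror the examples in the
# post; availability checking is left as an exercise.
PATTERNS = [
    "http://{kw}.wordpress.com",
    "http://{kw}.blogspot.com",
    "http://del.icio.us/{kw}",
    "http://digg.com/users/{kw}",
    "http://technorati.com/profile/{kw}",
    "http://{kw}.typepad.com",
    "http://myspace.com/{kw}",
    "http://squidoo.com/{kw}",
]


def real_estate_urls(keyword):
    """Return candidate profile/subdomain URLs for one keyword."""
    kw = keyword.lower().replace(" ", "")  # most hosts disallow spaces
    return [pattern.format(kw=kw) for pattern in PATTERNS]


for url in real_estate_urls("blue widgets"):
    print(url)
```

Run it once per niche keyword and you've got a registration checklist instead of a vague intention.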
