Open Questions: When To Never Do Article Submissions

Got a question in my E-Commerce SEO Checklist post from Rania, who didn’t leave me a link for credit.

“4. Steal your competitors articles and product reviews and do article distribution.”

You recommend STEALING articles from competitors as an advanced SEO tactic?! Seriously?!

How about recommending that users create their own unique content in order to increase their site traffic and ranking.

Suggesting that people steal competitors work really says a lot about you- and your company.

Good luck.

I know you’re only trying to make a point but I’ll accept the question anyway.

Why would I steal my competitors’ articles for article distribution instead of writing/sharing my own?

The answer is about much more than time or ethics.

1. Unnecessary Competition

A typical article distribution involves submitting to around 1-5k article directories and e-zines. Any time you submit the same piece of content to that many sites it creates unnecessary competition. This is especially true if your site is new. The article directories and e-zines are old, your site is new. They win. While they usually won’t take you out on the primary keywords, especially with your site linked in the article, they can snag a lot of unknown positions in your longtail and mediumtail phrases, pushing your site down and losing you an unknown bit of long-term traffic. This can go all the way up to a worst-case scenario. Last year when Acai Berry was a hot niche a lot of people were SEOing for the term and many did article distributions. While their sites never made it into the top 10, the articles they submitted came closer. So when Google did some manual reviews and bitch slapped a bunch of rebill promoters and affiliate pages on the term, most of what was left was the articles. All ranking, and for a while there while everyone else readjusted, a bunch of article directories were left taking all that good traffic. The first thing the article directory owners did, of course, was edit the articles, take out the links to the authors’ websites, and throw in their own affiliate links. Lesson learned, but that brings us to point two.

2. Don’t Do It Unless Others Are

ESPECIALLY IF YOU ARE IN A SMALL NICHE! God damn, I see people do this all the time. They find a nice little niche with nearly zero competition, and with the minuscule effort they have to put into ranking they realize the only way they know how to build backlinks is to article submit. They get flooded out and cry. Uhhg. The point is the same as point one. Don’t do it unless others are. I follow the rule outlined in SEO Empire: if you want to win, Match and Exceed. If it becomes a problem for you to maintain your ranking, then submit where they submit and a bit more. So don’t submit unless you have to, but if you do make sure you…

3. Submit Your Competitors’ Articles Only

All those same articles on thousands of sites create a massive duplicate content penalty opportunity. So the worst choice you can make is to go through the articles published on your site and submit them. In the middle of the bad-choices spectrum would be to write unique articles and submit those. Unique articles are for pulling in SEO traffic, and thus they belong on your site and preferably nowhere else. You throw little fits when people steal your content, so why would you willingly give it out? The wisest choice is to submit your competitors’ articles, because if you’re going to put anyone at risk for duplicate content you might as well put them, not yourself. I’ve always said there are two ways to rank: you going up or them going down. I own about 170 article directories as part of my basement. I understand how the article game is played. People aren’t submitting articles to my sites because they want my directory to be the best-smelling turd around. They want the links. I want the pages of content for link laundering and they want the links; that is what it’s all about and nothing more. No one owes me unique articles, nor is it doing them any favors to give them to me.

Hope that clears up the game rules for you Rania. Sorry it’s not advanced, but there are good reasons for everything we do. Please understand, I don’t tell people to be evil just because I enjoy watching them be evil (even though I do). There is always some form of risk whenever you are being evil. Yet never hesitate to be evil when it’s a sink or swim situation; always match and exceed.

–>

SEO Checklist for E-Commerce Sites

Answering a question on Wickedfire here.

If you own an ecommerce site and don’t know where to begin on the SEO, go through this checklist. In total, it’ll cost less than $500.

1. Sign up on all related forums. Put your site in the footer links and go through answering product-related questions on a weekly basis.

2. Create a free OSCommerce and Zencart template related to your parent niche (if you sell CDs, make a music template), insert your link on it, and distribute it to template directories and their repositories.

3. Create an articles section on your site and put in a form allowing people to submit articles. Email hobby blogs in your niche asking to use some of their particularly good posts in exchange for a link back in the article. This will make them aware of your site and they might even link to you in future posts when talking about a particular product.

4. Steal your competitors articles and product reviews and do article distribution.

5. Create a blog on the site and give out manufacturer coupon codes regularly. This will sometimes help with getting negative results. Post those coupons on the forums from item #1.

6. Put all your products in Google Products (froogle/base). This will sometimes help with getting negative results.

7. Browse Google Products for small ecom sites with no reviews and similar products, and do link exchanges with them via an automated link exchange script on a separate page.

8. Make sure you optimize your onsite SEO. I assume you know how to do this.

9. Download all the product manuals, convert them to HTML, and attach them to each individual product. Link back to the product on each manual. This will give you more pages for indexing and catch a lot more longtail keywords.

10. Spam the fuck out of Yahoo answers and similar.

11. Directory submit! It may not work well for other sites of yours but ecommerce sites are almost always welcome in directories.

12. Customize a nifty and unique toy style item with your logo on it and mail it to the most popular bloggers in your niche. Shirts and hats also work well.

13. If you have access to the products get a webcam and pretend to be a vlogger. Review the products and post them on all the major video sites.

14. Create autoblogs and link wheels.

There’s more, but I think that’ll keep you busy enough for now.

EDIT: There was some confusion in the comments on what I meant by “Negative Results.” “Negative results” or “negative rankings” are the results inside of the regular results that Google puts in, such as:

Video Results
Image Results
News Results
Product Results
Blog Results

They used to always appear above the regular results, so we call them negative rankings because they’re less than #1. Now they tend to go between random positions. This term may change the older this article gets.

–>

How To Take Down A Competitor’s Website: Legally

They stole your articles, didn’t they? You didn’t even know until they outranked you.

They jacked your $50 lander without a single thought to how you’d feel? Insensitive pricks.

They violated your sales copy with synonyms. Probably didn’t even use a rubber.

They rank #8 and you rank #9 on EVERY KEYWORD! Bastards!

Listen, why don’t you just relax. Have a seat over there and enjoy an Otterpop. You obviously Googled this article for a reason. No one reads this blog anymore. I haven’t even updated since November, or so I hear. Literally, someone had to tell me I haven’t updated since November. I didn’t check, so I’m just going to assume they wouldn’t bullshit me. The point is you have anger issues and would like to turn those anger issues into an anger problem. That’s cool, I’ll show ya how, but first lose your initial instinct to go troll a forum asking about ways to take down a competitor’s website, which inevitably leads to trying to find a hacker. Don’t be stupid; illegal shit will ruin you eventually. Instead join the vigilante team to be a real jerk about it.

How To Take Down A Competitor’s Site The Asshole Way

First create a free account on www.spamcop.net. Spamcop.net is a large anti-email-spam service that I hear is secretly owned by Spamhaus (another free antispam service). It allows very bored antispam vigilantes (“spamcops”) to anonymously report email spam. From there they track down the host and server provider, tear out all the contact info so they can’t trace it back to you, and send them a report claiming you’ve been sending email spam. The fortunate and unfortunate thing about spamcop.net is they are very reputable despite having no method of verifying that any of the reports are real. Hosts/server providers take their reports very seriously; oftentimes after 3-4 reports in a short period they will disconnect the target’s server IP, and a few more after that, they cancel the server, keeping all its files. It really is a guilty-until-proven-innocent scenario. The only way to win if someone is reporting fake spam against you is to really be a spammer and just shut down and move to a new host. If you’re a legit webmaster you’re pretty much fucked, to say the least, because they’ll just keep telling you to stop doing what you’re not doing, and if they’re really patient they’ll ask you to prove you’re no longer doing what you were never doing. LOL

DMCA Notices Don’t Have Shit On Spam Reports

Once you create your account and log in, go to the Report Spam tab…

*disclaimer* I’M NOT ACTUALLY SAYING TO DO THIS. I’M SAYING IF YOU WERE TO DO IT, THIS IS WHAT YOU WOULD DO, BECAUSE THAT IS WHAT OTHERS ARE DOING. I AM AN ENLIGHTENER TO WHAT ALREADY HAPPENS EVERY DAY. I DON’T WANT TO BE SUED BY THESE ASSHOLES. LOL */disclaimer*

Copy and paste a few random spam emails currently in your inbox. Swap out the domains and IPs (they even give you neat tools for this!) and send away. If you do this every day for a week or two their server will eventually go offline. For them it becomes an uphill battle to keep from getting shut down.

This definitely isn’t a cool thing to do and isn’t in the spirit of competition, nor is it effective against large sites like Wikipedia or Yahoo (DANG!). Likewise I wouldn’t recommend doing it, but unfortunately the tactic exists and is in play already. The problem is with reputation. Spamhaus and Spamcop.net have a reputation for never being abused, but that’s only because no one ever bothers to check whether it is abuse, giving them a near perfect record. True recent story: I can’t say if my server really did send out spam or not originally, because I just had no way of knowing, but I got this spam complaint. It was nearly worthless; it just had one of my domains as the links in the spam mail and my server IP. I couldn’t find the problem, so I told the server provider. That wasn’t acceptable to them, so I made something up, saying umm yeah, it was a sendmail hack or something. I fixed it. I fortunately could tell, from some hidden code in the email that got skipped by the censors, who the spam complaint came from. I thought it was over with, but unfortunately I got complaints from him every day for about a week and a half. I fended them off till they eventually shut off my IPs with no warning and wouldn’t even give me access to the server to fix the problem. Haha, oddly enough they said they wouldn’t turn it back on until I did fix the problem… idk. Anyways, I said screw it and let the server stay down for two weeks. I come back and the guy is still sending spam complaints. Each one is dated for the current date. Basically he was spinning these emails and claiming my server was spamming him for two weeks without the server even being on. By the time I could even point this out it was too late; the servers and everything on them were pretty much dead.

THIS IS THE REALITY! THIS IS WHAT HAPPENS.

So I definitely don’t feel bad if this information gets abused because it may eventually lead to a solution to a major problem that causes a lot of damage to the rest of us.

–>

Addon Domain Spamming With WordPress and Any Other CMS

I got this question from Primal in regards to my post on Building Mininets

Eli,

I like the post and your entire site. Thanks for sharing your knowledge. One thing confuses me about this particular tactic. Where are you getting the content from? You mentioned Audioscrobbler and Youtube API but I am not focusing on a music niche. The “widely available car db” sounds more like something I could use. Can you say where you would get something like this from? Also is there any reason why I should use customized pages instead of a CMS like WordPress to generate these kinds of sites?

Upon a glancing read this question seems to focus too much on the exact example used in the post. Yet if you really read the multipart question thoroughly and get to its core, it’s a FANTASTIC question that really needs an answer in more depth than what I would put in a comment response. The mininet building post isn’t about how to use APIs and RSS feeds to get content. Nor is it about creating a custom CMS or doing multiple installs of the same site structure (I covered that in depth in my SEO Empire post and called them ANT Scripts). The real down-to-brass-tacks gem behind the technique is understanding how to do Addon Domain Spam via environment variables such as HTTP_HOST to create a lot of sites from a single install of ANYTHING. I’m absolutely a firm believer that addon domain spam is the future of webspam. Subdomains had their day, and now it’s time for figuring out creative ways to create a ton of unique sites from a single platform. This doesn’t always have to be done through addon domains and, as mentioned in the comments, can be done other ways, such as editing the httpd.conf. For now though I wanted to focus on the basics, such as using addon domains and, if you’d like to go cheap about it, subdomains, and let the SEO ingenuity naturally evolve from there.

To answer your question: yes, you can use databases to help with the content for these sites. Check out my Madlib Sites post for some great ideas on how to accomplish that and use databases. As for the second part, YES, you can use other CMSs such as WordPress!

How To Use WordPress To Do Addon Domain Spam

I got several emails from people asking how to create a WordPress plugin to accomplish this technique, as well as a comment from longtime reader PhatJ. I realize at first thought this sounds like a complicated process, converting WordPress over to being able to read multiple addon domains and treat them as multiple installs, and it would seem to require some sort of plugin being created, but as with most things the simple solution is often the best.

The easiest and most effective way I’ve found to convert any CMS to be used for addon domains is to simply edit the config files. No joke, that’s seriously usually all it ever takes. In my WordPress wp-config.php file I grabbed the line that declared the database:

define('DB_NAME', 'database1');

I replaced it with a simple IF ELSE statement to check for the domain and define the appropriate database:

if ( $_SERVER["HTTP_HOST"] == 'domain1.com' ) {
    define('DB_NAME', 'database1');
} elseif ( $_SERVER["HTTP_HOST"] == 'domain2.com' ) {
    define('DB_NAME', 'database2');
} else {
    define('DB_NAME', 'database1');
}

Then I just pull up each domain in the browser or mass WordPress installer script and set up each blog as if it were separate.
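With more than a couple of domains that if/else chain gets unwieldy. Here’s a minimal sketch of the same trick using a lookup array instead; every domain and database name below is a made-up example, and the pick_db() helper is mine, not part of WordPress:

```php
<?php
// Map each addon domain to its own WordPress database.
// Every name below is a made-up example.
$db_map = array(
    'domain1.com' => 'database1',
    'domain2.com' => 'database2',
);

// Pick the database for the requesting host, falling back to a default.
function pick_db($host, $db_map, $default) {
    // Strip an optional "www." so both forms hit the same database.
    $host = preg_replace('/^www\./', '', strtolower((string) $host));
    return isset($db_map[$host]) ? $db_map[$host] : $default;
}

// In wp-config.php this replaces the hardcoded define('DB_NAME', ...).
$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';
define('DB_NAME', pick_db($host, $db_map, 'database1'));
```

Same single install; adding a new blog is one new line in the array plus a new database.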

To show you it in action I put up a single WordPress install on a subdomain on Bluehat. I then added a second database and put that code into the wp-config.php. Looking at each, you’d have no idea they were a single WordPress install. See for yourself:

Domain 1: http://addtest1.bluehatseo.com
Domain 2: http://addtest2.bluehatseo.com

Thanks for your question Primal!

–>

Blue Hat Technique #21 – Advanced Mininet Building

I promised a while back that I’d teach you ugly bitches more ways to build your sexy SEO Empire. With some spare time this week I might as well take some time to help your nasty hooker ass do just that. YES I will insult you through this entire post, because judging from the recent comments you donkey fuckers are getting a lil too big for your own britches and need to be brought down a peg. I’m kidding of course. You guys are great. I just feel like filling this post full of as many reasons not to read it as possible, and since no one gave me an excuse to do it, I just made one up. This post will be advanced, and since this technique’s ability to be bulletproof feeds off creativity and the subtleties of being self-made, I’ll also only give out pseudo code instead of code samples. It is, however, an extremely efficient way to build large amounts of unique and self-promoting sites and is more than reusable for just about any chunk of niches, so modularize your code and save it for future scaling. Trust me, you’ll wish you did.

Getting Started With Your Custom Mininet Generator

It’s always easiest to start a project with an example in mind, so begin by picking a generalized niche that can involve a lot of different sites along the same theme. For this example I’ll use music-based fan sites. So I’ll grab a few starter domains to test with, such as AudioslaveFanzSite.com, MetallicaFanzSite.com, KanyeWestFanzSite.com, and maybe one more for good measure, JonasBrothersFanzSite.com. <- See how I was all insulting and mean at the beginning of this post and suddenly changed it around to being all considerate, using boy bands as examples so you can relate to what I’m saying here. I’m not mean all the time and can in fact be quite understanding. *grin* Anyways! Now that you’ve got your domains, set up an account under a single domain on your server and add the rest as addon domains. In your site’s root make a Sources folder and another to hold a data dump. After that set up a MySQL database to be used by all sites and put a row in a table for a single domain you bought (the rest will be inserted automatically later). I recommend you put the actual domain in some value in that row.

Build a Single Universal Template

This is easier than it sounds. You can always go 100% custom, but to save time I like grabbing a generic-looking premade template. I then put it into a script and dissect the HTML to put in as many variables as I can fit. A few examples would be

$heading1

$maincontent

which I will later fill with the actual content and variable values. Pack the template full of as many customizations as you can, so it will not only be flexible and universal among all topics in the niche but the HTML itself is very random and as un-cookie-cutter-like as you can get it. Towards the end of the process I also tend to throw in as many $randspacing-type variables as possible. Then I use a randomizing function to create various spacing and line returns and randomly insert it throughout just about every group of HTML tags. I mention this now instead of later in the post because it’s important to realize that you will want this template to be as flexible as possible, because you’ll eventually be using that same template on a TON of sites that may or may not be doing some interlinking, so you don’t want it to appear as a network. Changing colors, widths, and images around is a great way to accomplish this; just don’t get too complicated with it starting out. Keep it very basic, and once you’ve got the mininet nearly done you can add as many as you’d like later. It’s all too easy to throw yourself off focus and doom the project by getting too hung up on getting the same thing perfect. For each variable you place in the template you’ll want to put the same as a field in the SQL table you created previously.
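To make the pseudo code a little more concrete, filling those template variables can be as simple as one strtr() pass. A minimal sketch, assuming the $-prefixed placeholder convention above; the function names and sample values are mine, not from any real generator:

```php
<?php
// Swap $-prefixed placeholders in a template for their real values.
function render_template($template, $vars) {
    $swap = array();
    foreach ($vars as $name => $value) {
        $swap['$' . $name] = $value;
    }
    // strtr() matches longer placeholder names first, so $randspacing
    // can't be clobbered by a shorter $rand variable.
    return strtr($template, $swap);
}

// Emit a random run of line returns and spaces so the HTML
// spacing differs from site to site.
function rand_spacing() {
    return str_repeat("\n", rand(1, 3)) . str_repeat(' ', rand(0, 4));
}

$html = render_template(
    '<h1>$heading1</h1>$randspacing<div>$maincontent</div>',
    array(
        'heading1'    => 'Metallica Fan Site',
        'maincontent' => 'Bio, videos, wallpapers...',
        'randspacing' => rand_spacing(),
    )
);
```

One render function, fed per-domain rows from the database, covers every site in the mininet.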

Putting Together Some Content Sources

For an example such as the music fan sites mininet, I’d probably jot down a few sources of content, such as Audioscrobbler for the band descriptions, primary image, and discography. Then the Youtube API for a few music videos for each musician. Another great source would be Yahoo Images for some band-related wallpapers, and Alexa for some related sites to link to. I might even grab the Google Blogsearch RSS for some recent blog posts related to that artist. Starting out, it’s usually best to keep your sources as simplistic as possible and not stray too far from readily available RSS feeds and APIs. Like I said, you can always get more advanced and custom later. Create a module script for each source and put it in your previously created Sources folder. Then for each source you came up with, add it as a table in your SQL and put in all the fields you’ll need for each one, and remember to save room to identify the domain on each one.

Building The Generator

Create a backend script that will simply be a place to copy and paste a list of the domains and their primary keywords into, with a button to submit it. My domains and keywords for this example would most likely be pipe delimited, such as:

GodsmackFanzSite.com|Godsmack
U2FanzSite.com|U2
BeyonceFanzSite.com|Beyonce Knowles

Once the list is submitted, the generator needs to insert a new row into the table and create all the randomized variables for the site, such as the background colors and various spacings (and/or a brand new template file stored in the data folder), putting them in the same single row. Once the basics are done it can call all the source modules and run them, using the domain name and the keywords they need to grab the right content. They will then need to put that content into the database under the proper domain’s row(s). You now have all the content you need for each site, and each got its own template! Now it’s time to just build the son of a bitch.
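The paste-and-submit step of that backend boils down to splitting the list. A rough sketch under the pipe-delimited format above; the function name is invented for illustration:

```php
<?php
// Parse a pasted "domain|keyword" list, one pair per line, into
// rows ready for insertion into the sites table.
function parse_domain_list($text) {
    $rows = array();
    foreach (preg_split('/\r\n|\r|\n/', trim($text)) as $line) {
        $line = trim($line);
        if ($line === '' || strpos($line, '|') === false) {
            continue; // skip blank or malformed lines
        }
        list($domain, $keyword) = explode('|', $line, 2);
        $rows[] = array('domain' => trim($domain), 'keyword' => trim($keyword));
    }
    return $rows;
}
```

From each returned row the generator would then seed the randomized template variables and kick off the source modules with the domain and keyword.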

BUT! Before I show you how, I’ll give you a few examples of how I would set up my tables for each of the sources so you can get a better idea. For my Youtube table I’d probably keep it simple and just do the domain and the embed code:

Youtube: Domain|EmbedCode

Audioscrobbler: Domain|BandDescription|Albums|PrimaryImage

YahooImages: Domain|PathToImage

GoogleBlogSearch: ID|Domain|PostTitle|PostDescription|PostLink

Alexa: Domain|RelatedSite1|MySite1|RelatedSite2|MySite2|RelatedSite3|MySite3|MoneySite1

*The MySite1 would be another random fan site in your list of domains. The MoneySite1 would be a money site of yours you can insert later to help with upward linking. These are foundation sites after all.

So simple even a retarded piss bucket like yourself can figure it out.

Scripting The Sites

I know some of you are going to talk about dedicated IPs for each site and various other expensive ways to make sure the sites don’t get associated with each other, but there was a good reason I said to use addon domains, although there are other more complicated and better solutions. The first thing you should do when scripting the index page is to grab the current domain using an environment variable such as HTTP_HOST. Once you have the domain name you can use that to grab all the appropriate data for each domain name, and you only have to code one site and get it to work for ALL the sites in the mininet. For instance, if someone goes to JayZFanzSite.com it’ll grab that into the variable and customize the entire site to be a Jay-Z fan site, even though it’s all the same script controlling all the addon domains. I always start with the main page and branch all my subpages off that. For instance, for the JayZFanzSite.com I would put in a section for Jay-Z Music Videos and link to More Jay-Z Music Videos (the Jay-Z being that domain’s primary keyword as specified in the DB). That Jay-Z Music Videos subpage would just be more previously scraped music videos from Youtube. The same would be done for the Jay-Z Wallpapers, Jay-Z Discography, Jay-Z Lyrics, Jay-Z Guitar Tabs… whatever sources I’m using. Each would be a small section on the main page and would expand into its own subpage, which would target popular keywords for that artist. Once all that is done and built into the template, you can test each change among all the current test domains you have to make sure each shows up nicely and the randomizations and backgrounds are all static and neat for each site. Be sure to put in a place for your Alexa similar sites and, as shown above, mix in links to your other fan sites for each band/musician, as well as some placements for your current and future money sites so they can all get good link volume.
Once every test site looks pretty and is fully functional with fairly unique content, all you have to do is scale up with more domains.
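The HTTP_HOST dispatch at the core of this is only a few lines. A minimal sketch, with the site rows hardcoded where the shared MySQL table would be queried; the function and column names are invented for illustration:

```php
<?php
// Resolve the requesting addon domain to its row of site data.
// One index script serves every domain; only the data differs.
function resolve_site($host, $sites) {
    $host = preg_replace('/^www\./', '', strtolower((string) $host));
    return isset($sites[$host]) ? $sites[$host] : null;
}

// Stand-in for the shared database: domain => site row.
$sites = array(
    'jayzfanzsite.com' => array('keyword' => 'Jay-Z'),
    'u2fanzsite.com'   => array('keyword' => 'U2'),
);

$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : 'jayzfanzsite.com';
$site = resolve_site($host, $sites);
if ($site !== null) {
    // Every section title is built off the primary keyword,
    // e.g. "Jay-Z Music Videos", "Jay-Z Wallpapers", and so on.
    $videos_heading = $site['keyword'] . ' Music Videos';
}
```

Adding a domain to the account and a row to the table is all it takes to stand up another site.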

BUT FIRST! I like to incorporate ways for each site to build its own links. For example, with the Google Blogsearch posts I’d put a section on the main page for Jay-Z News listing the most recent 25 or so blog post results. Then I would build a small cronjob script to update it every day with 25 or so new posts and do a pingback on each to score a few unique links from other related sites every day automatically. This way you not only have lateral links from other sites on the mininet but links from outside sites, and the links are always growing slowly, so each site can continue to grow in rank and traffic over time.
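For reference, a pingback is just a small XML-RPC call (pingback.ping with a source and target URL). Here’s a sketch of building that payload; discovering the target blog’s XML-RPC endpoint (via its X-Pingback header) and POSTing the body are left out, and the function name is mine:

```php
<?php
// Build the XML-RPC body for a pingback.ping call announcing that
// $source (our Jay-Z News page) links to $target (the blog post).
function build_pingback_xml($source, $target) {
    $source = htmlspecialchars($source, ENT_QUOTES);
    $target = htmlspecialchars($target, ENT_QUOTES);
    return '<?xml version="1.0"?>'
        . '<methodCall><methodName>pingback.ping</methodName><params>'
        . '<param><value><string>' . $source . '</string></value></param>'
        . '<param><value><string>' . $target . '</string></value></param>'
        . '</params></methodCall>';
}
```

The cronjob would build one of these per scraped post and POST it to the source blog’s XML-RPC endpoint.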

Buying More Domains and Scaling Up

As indicated, I like to keep it simple and pick a prefix or suffix that has many open domains. That way I don’t have to spend a ton of time picking out the domains; I can just grab a list of several thousand popular bands, mass buy the domains, then copy and paste them into the backend generator. Boom! Several hundred to, if you’re bold enough, thousands of new sites. All of which will grab quite a bit of underexposed traffic from keywords, image search, and links. It will also give you tons of links and awesome pagerank for future sites you build. It’s a lot of work initially, but it’s definitely easier than hand building all those sites yourself, and the sites can easily become just as successful as if you did, especially if you did a good job with your sources. Once you’ve scaled up that mininet to a level you’re comfortable with and can maintain financially (it helps to build a monetization module into the site so you can easily switch out ads among all the sites so they can create the most money possible per site), you can switch to a new group of sites using the same code, many of the same sources, and the same method. The music fan site example is great because nearly the exact same code can be used in so many ways. For instance, I can take the same damn code, get rid of the Audioscrobbler and swap it for a widely available car DB for the description, image, and car specs, and build a whole mininet for every single make and model of car out there with a whole new set of domains, such as JaguarXJ220specs.com, BMW540specs.com, PontiacGrandPrixspecs.com. It’s as easy as swapping out the keywords used in the modules so they become Pontiac Grand Prix Videos (from the Youtube source) and Pontiac Grand Prix Wallpapers/Images. All you need is a new template and a new group of domains to build an absolutely massive and diverse mininet that is actually profitable and self-growing.

PS. I know I said HUNDREDS and THOUSANDS of sites all dramatically, but as with all techniques start off small. Get your scripts and promotion right. Make sure it works and is profitable on a per site basis before scaling up to any ridiculous levels.

LATA JERKS!

–>

Review of AutoPligg Backlink Tool

If you already own AutoPligg read this post anyways. I promise you’ll learn something…

Today’s tool review is of a tool called AutoPligg by the Syndk8 crew. You’ve probably already heard of it by now, but I’ve been using it for the last 6 months or so and I wanted to give you some insight on the tool and maybe some resources if you don’t already have it. AutoPligg is a Windows or serverside based tool that spams the popular Pligg platform, a CMS that is basically like Digg. There are a TON of sites out there using the Pligg platform, which makes this tool more than deserving of a review.

How Is It Useful

I use the term spam tool hesitantly in the case of AutoPligg because it’s unique in the fact that, yes, it is a spam tool and it is automated link building, but I think its real power lies in using it for white hat purposes: a way to mass post your stories to a whole lot of fuckin sites that want your posts. If you look at the typical Pligg site, it’s structured much like Digg, with categories; you put up your links in those categories, people can vote them up or down, and people leave comments on them. Ya know, all that bullshit you’re already familiar with and tired of. You can get links one of two ways on them. You can post your links (typically article links, if you don’t want them deleted) as stories, or leave comments. Very few of these sites are very high PR or of a high link quality, but any time you have a platform that allows you to post links on it, with well over 100k working sites out there using it, it becomes a link builder’s wet dream. For that, AutoPligg becomes a very useful tool. Here’s where the catch-22 happens, however.

The Quality Of The Links It Builds

It’s no secret that I’ve never been a believer in nofollow and its ability to be “not followed,” but I’m never very vocal and definitive about it because 10 minutes after I say something about it, it could change lol. Ain’t that a bitch. The main story links are nofollowed by the Pligg platform. So every link you post as a story will be nofollowed. HOWEVER! The link will be the top outside link, excluding the site’s possible navigation. It will be in a heading tag and have your anchor text. Plus it gets its own page that will be cycled through the main page. Here’s the strange shit: the comment links are dofollow. They are only placed automagically by the platform by putting the full URL, including the http://, in your comment. They are dofollow, but they don’t have your anchor text. They don’t cycle through the main pages, and they are standard links beyond the point of the page’s navigation. By now, if you’ve been paying attention to this blog, you should already know what link type I prefer… both! But let’s not lose focus on the point of this tool by worrying about the quality. It is by all uses a link volume building tool, NOT a link quality building tool. It’s nice to have at least a bit of link quality with every link so they produce the link worth necessary to make them count instead of going supplemental. This brings up another SEO concept that the tool was kind enough to also address: link indexing saturation. I was really curious, with all the links it builds at once, whether it’ll have at least something to get them indexed. It does have a pinger that also allows proxy use. Cool! Good enough for me. Between that and the outputted list of successful postings, I have everything I need to get as high of a link indexing saturation rate as possible. Kudos on that.

How Is It Not Useful

I may spare no punches when it comes to the negatives of tool reviews on BlueHat, but to date I’ve never given a negative review of a tool. This is simply because I’ll only post the review if I think the tool is useful to you, the reader. I tend to reserve that right when asked to do a review. AutoPligg does have a VERY strong negative side, which I was really hoping would go away in the months since its release so that this review could be all positive and not get a negative just because of a few issues that typically work themselves out in the first few months after a release. At this point, almost 8 months later, it appears this isn’t going to happen, so the beef still stands. Any time a new link building tool comes out there are always those few people (amateurs) that want to get their “money’s worth.” They use the tool in the most retarded way possible and do it as hard as they can. It typically makes the tool worthless for the first couple months. It does them no good and provides no benefit, and at the same time hurts everyone else and the tool itself. AutoPligg, stemming from the Syndk8 crowd, who are notorious, and even self-proclaiming, for pulling this kind of shit, wasn’t expected to be an exception. ESPECIALLY since it has a Windows based version. At least with serverside versions you have to have a few braincells to use the program, and with web based versions you can cancel their accounts. So this was entirely expected, but what shocked me was this: it was, and since the launch always has been, only about 3 users of the program that have been doing this idiotic, irresponsible use of the tool. Unfortunately, as I mentioned above, they’ve yet to stop.

What they've done is, they've opened up like 10 instances of the program and put in a single super long comment into the comment poster that's nothing but a ton of links to separate subdomain spam on a single site (I guess after buying the tool they couldn't afford more domains). They have a macro restarting each submission over and over and over so every single post on every single Pligg site instantly gets hit with a ton of worthless spammy links. The main domain has been banned for months now so instead of stopping the script they just forwarded the domain as if it was going to do them any fucking good. Especially since they forwarded the domain and all its subdomains to a single Wordpress blog with some spammy text written by the world's worst content generator and no ads or offers or particular keywords. Here, let me show you what I'm talking about and you can decide if this isn't the most retarded shit you've ever seen: http://www.5wing4.net/story.php?title=Cat_Urine_Removers-1. If you're like OMG that's my site he just outed, consider this before bitching to me: if you don't want attention don't be an attention whore. You did it to yourself, fuck off, I don't care. The sites were worthless the moment you got put in charge of SEOing them anyways.

So please, use this tool responsibly. Even if others aren't, it's still a very useful tool and one I would definitely recommend you have in your arsenal because, like I said, it's for link volume and deep link volume, not link quality. For that reason I'll go over the best way to use the tool and how to get the most out of it, because oftentimes the trick to harnessing the optimal power out of a link building tool is using it correctly.

How To Use Autopligg The Right Way

As with all Blue Hat Techniques the short answer is: mimic the white hatters. Now that you know how not to use the tool I'll show you how I used it and got a lot of success. I am actually planning on writing a guest post on this for SEOBook that'll cover it more in depth, but for now I'll at least introduce what I call MacroNiche Blogs, and if he doesn't end up publishing it I'll at least post it here for y'all. <- Check it out Quadzilla, another new SEO concept! Quick copy paste copy paste copy paste!

MacroNiche Blogs

People often talk about Niche Sites and MicroNiche Sites. In case you're not familiar, they're basically talking about the size and focus (scope) of a particular site. For instance a niche site might be a site on female orgasms. It'll have the main page which sells some product(s) related to those and it'll have a few other articles on particulars such as Gspot orgasms, clit orgasms and various other myths. Each of those pages will also typically sell a product or an offer related to them as well. A MicroNiche Site is very similar except it's more focused, such as the main page being solely on the Gspot orgasm, then a few supporting subpages on the same topic but all pushing the same product and offer. Since I love to confuse things I tend to encompass all of these types of sites into the single "Money Site" term. This is because I technically consider every site, no matter where it is in the empire's scale, a niche site because it does focus on a niche, thus it is a niche site. Therefore I tend to group things on the intensity and focus of the site pushing a single offer or product. A Money Site would be at the top of that scale. Therefore a MacroNiche Site, or in this case MacroNiche Blog, is exactly what it implies. It's a single blog that encompasses a very large niche but doesn't focus on that niche. Instead it focuses on lots of much smaller niches within the single site and uses the authority being passed through the subsections to help the others, so new sites for each microniche aren't required to rank for each individual offer.

Although accurate, I may have explained that in a way that was a bit more complicated than it is. Let's use an example, an example that happens to be the very first test site I used AutoPligg on. My blog's macro niche was Health. I then looked through a bunch of offers on some CPA networks and made a list of niches based on all the offers I found that would fit into the Health macro niche. So I set up my blog with a bunch more categories such as: Weight Loss, Hair Care, Teeth Care, Bodybuilding, Skin Care, Exercise. The list goes on and on. I was very thorough in my niches. I then put in some subcategories under each that were my microniches. These were more focused on individual offers. For instance under Weight Loss I put in Dieting and Diet Pills, under Teeth Care I put in stuff like Teeth Whitening, and under Exercise I put in Exercise Equipment, Yoga, blah blah.

This MacroNiche structure was the perfect test for AutoPligg because it utilized AutoPligg's most powerful attribute: its ability to build both deep link volume and main page link volume. Had I only used a Niche Site or a MicroNiche Site, after submitting the main page and all the subarticles I would have become immediately limited to just spamming the comments for further link volume. The ability to continuously use both the article submission for the microniches and the standard niches and the comment postings for the main page authority and link volume gives this structure a huge benefit in its overall domain authority and its ability to quickly rank new posts. So that's exactly what I did and continue to do.

First, once I got the design and categories set up, I hired a couple writers to go through each of the categories and subcategories and schedule several months' worth of daily posts, making sure each category and subcategory was accounted for within the first month and then at least once each following month. Once they were done I immediately did a directory submission for each primary category as well as the main page, along with some social bookmarking and various other link building. As you're all bound to ask: for my primary keyword for the main page I picked one that was fairly medium competition that would grab the attention of other health related bloggers. That way later, when I got that ranking and wanted to do some blogroll link exchanges, it was easy to find willing targets. Then every day I would give special attention to the individual blog posts for that day. For each one I would submit it through the story submitter in AutoPligg so it would get several thousand links, plus each of those links would get pinged for some link saturation, along with a few daily rounds of comment postings for that category and the main page. I would then do a quick scan for similar articles on each post and submit those through several hundred article directories with links to both the singular post and the main page as well as the subcategory with the keywords for each. I would then ping and social bookmark the individual post. After that all I had to do was find 5 new posts with the same keywords as that post and exchange links within the post to the other bloggers' posts *cough* Pingcrawl *cough*. This also got other bloggers in the niches and macroniche to notice my site and be more willing to do blogroll exchanges. That caused each individual post to quickly start ranking the same day/week it was posted and brought up the total site authority with every one.
Which brought me to my final step: bringing up the total site authority as much as possible via the main page, by placing links to the main page across my SEO Empire and running some automated link building tools as well as general link building such as commenting on other blogs/news sites. The fact that it was a MacroNiche Blog with legitimate articles on each subject made each post more credible and the site more credible overall, so if there was a human review, or in the case of directory submissions, it was more palatable and acceptable to the reviewer.

This is where it turned into a mutha fuckin Money Site! I could go through the offers at my leisure and write targeted sales copies pushing the offers and throw them up as I wrote them as if they were their own microniche site. Likewise when new offers would come out in untapped niches, while everyone else had to start building new sites around them and then start the link building to get them ranked, I could just immediately jump in with a new post and instantly hold a new ranking using the site's existing authority. You actually see this happen all the time when shit like Oprah's fat ass blogs about a product. All I was doing was using Autopligg and my eye for structure to mimic that effect. There's obviously a lot of spins and ways to build a MacroNiche Site, as with any structure, but if you start out and do it right the first time you'll quickly learn and expand very nicely. There's also a lot of macro niches out there to take over, such as Business (bizops, grants, paid surveys, etc), Consumer Reports (credit cards, credit reports, etc) and Downloads (adware, recipe/music/games programs, etc).

Pricing

Now that I've given it a fairly positive review and got you started on how to have success with it (the two make a logical pairing), let's talk pricing, because as it stands now it's a bit more complicated than it should be for tools I review. It's normal for IM tools to fluctuate their prices to accommodate sales volume drops, or when they say it's only $5 for the first 5 people then it's $100 for every person after that. That's all mind fuck marketing bullshit. The only excuse to fluctuate a price in a tool is when new versions come out (new versions with new features, not new versions that simply fix old bugs for that matter) because that causes more development costs along with support and marketing costs. So in that right it's understandable. Normally as a condition of doing a review I get assurance that the price, at least for the readers here, will not change and that there is always some sort of long standing discount. This time I'm going to break the rule just a bit. When Autopligg released it cost $289. As of writing this post it's $189. I'm told a new version is coming out next month and the price will go back up to $289. This is okay with me in this case for a few reasons. First, I was okay with the original $289 price, so as long as the flux doesn't go back above that then I'm cool. Second, I've been assured that the people who buy at the $189 price will get the new version free. There's nothing I hate more than having to buy the same product twice. Third, allegedly the new version will have a new feature of being able to crack the recaptcha captchas used by many of the autopligg sites. This opens you up to literally tens of thousands of new targets. That in itself is a big enough feature to warrant a price change. Finally, after talking to Earl Grey (the owner) I got a coupon code that should last through the price change with the new version.

AutoPligg Purchase Website

Use the coupon code: BLUEHATSEO for a $45 discount. I recommend you use the Windows desktop version. It's faster and more efficient than the serverside version. I know I know, weird eh

BONUS

Since lists of good pligg sites are tough to come by and the program requires one, I took the liberty of building my own list for you guys. Here's the database of the 7,409 pligg sites I use. Although most require manual registration (till the new version comes out), all are scraped by me, tested, working, and importable into the program. It should be fairly error free at the time of writing this post, but they do change daily, so bear with it if the list becomes systematically worthless as time passes. The best advice I can give you is, for now, ignore the lists they hand out in the private forum. Most are raw lists, a colossal waste of time to import (several hours each), and you only get 200-400 good ones out of them.

List of AutoPligg Sites
Right click -> Save As

–>

Conspiracy Theories Please

If you find portions of my writing style on this blog hard to understand, overly complicated, different than my writing style elsewhere, or just generally outside the realm of normal blogging, it's not because I'm trying to hide something, trick you, cover up a lack of experience, or prevent you from learning the technique. In fact I'm not blogging at all. I'm writing pseudocode.

Thanks for understanding,
-Fishy Eli

–>

Open Questions – Subdomains and Main Domains

I got a great question on my SEO Empire post from Ryan at NetSEO. I figured it was worth addressing in a post rather than leaving a really long comment.

Eli,
Can you be so kind and explain why this is:

“Primary domains can pass a penalty to subdomains. Subdomains can’t pass a penalty to a main domain unless the main domain holds a relation to the subdomain (ie. a link).”

Happy to answer, Ryan.

Anytime I make a statement like that I am usually making a reference to an exemption to the general This-Is-My-Site -> This-Is-Google -> This-Is-The-Value stream of things. Sometimes I'm a bit presumptuous in assuming readers caught the reference. In this instance I'm talking about the exception given to protect free hosts from penalties, particularly those who give their users subdomains such as Hypermart, Xoom, WordPress.com, Blogger, Tripod etc. This exemption can't only cover the popular free hosts, otherwise no new freehosts would ever stand a chance. As soon as they got a single spammy user their whole site could get banned and poof goes their legit business. Likewise, algorithmically it can't cover all free hosts because then the biggens like WordPress.com and Typepad would all be penalized. Looking ahead, this would also include profile based social sites such as Myspace and outbound linking social sites such as Delicious. Anyone remember when Geocities sites used to rank so well? Yet at the same time, with a lot of splog platforms out there, manual reviews would be a nightmare and unfeasible. So there is a line drawn. That line has to consist of some sort of relationship between the primary domain and the subdomain of a site that'll evaluate whether the "subsite" belongs to the main site or if it's a separate entity. By way of algorithms that relationship is very tough to determine. In fact it's damn near impossible to do with 100% accuracy. Unfortunately for them, they have the burden of relying on internal linking relationships between the two, which would include the above statement as well as other protective factors that encompass other exceptions such as nonstatic links (like Furl & Delicious would use).

This area of uncertainty gives us SEO peeps room to do things such as create splogs and do subdomain spam. As long as we know what they're looking for (the antispam teams) we know what not to provide. Most sites that retain ownership of their subdomains link from their main page down to the subpages to the subdomains and so on and so forth. So when in doubt, do the opposite. It'll provide less of a chance for a relationship between the main domain and subdomain to be found, and if you're worried about linkjuice, get it from other sources via deeplinks.

–>

Advanced White Hat SEO Exists Damn It! – Dynamic SEO

Hello again! I've been restless and wanting to write this post for a very long time and I'm not going to be happy until it's out. So get out your reading glasses, and I have it on good authority that every reader of this blog happens to be the kind of dirty old man that hangs out and harasses high school chicks at gas stations, so don't tell me you don't have a pair. Get 'em out and let's begin….

Fuck, how do I intro-rant this post without getting all industry political? Basically, this post is an answer to a question asked a long time ago at some IM conference to a bunch of gurus: does advanced White Hat SEO exist? If I remember right, and this was a long time ago and probably buzzed up so forgive me, every guru said something along the lines of there being no such thing as advanced White Hat SEO. Now I'm sympathetic to the whole self promotion thing to a small degree. If your job is to build buzz around yourself you have to say things that are buzz worthy. You can't say the obvious answer, YOU BET IT DOES AND YOU'RE RETARDED FOR ASKING! You gotta say something controversial that gets people thinking, but not something so controversial that anyone of your popularity level is going to contradict it in a sensible way, making your popularity appear more overrated than a cotton candy vendor at the Special Olympics. In short, yes, advanced white hat exists and there's tons of examples of it; but you already knew that and I'm going to give you such an example now. That example is called Dynamic SEO. I've briefly mentioned it in several posts in the past and it is by every definition simple good ol' fashioned on-site keyword/page/traffic optimizing White Hat SEO. It also happens to be very simple to execute but not so simple to understand. So I'll start with the basics and we'll work into building something truly badhatass.

What Is Dynamic SEO?

Dynamic SEO is simply the automated, no-guessing, self-changing way of SEOing your site over time. It is the way to get your site as close to 100% perfectly optimized as needed without ever knowing the final result AND automatically changing those results as they're required. It's easier done than said.

What Problems Does Dynamic SEO Address?

If you're good enough at it you can address EVERY SEO related problem with it. I am well aware that I defined it above as on-site SEO, but the reality is you can use it for every scenario, even off-site SEO. Hell, SQUIRT is technically dynamic off-site SEO. Log Link Matching is even an example of advanced off-site Dynamic SEO. The problems we're facing with this post specifically include keyword optimization, which is inclusive of keyword order, keyword selection, and even keyword pluralization.

See, the problem is you. When it comes to subpages of your site you can't possibly pick the exact best keywords for all of them and perfectly optimize the page for them. First of all, keyword research tools often get the keyword order mixed up. For instance they may say "Myspace Template" is the high traffic keyword, when really it could be "Templates For Myspace". They just excluded the common word "for" and got the order wrong because "Template Myspace" isn't popular enough. They also removed the plural to "broaden" the count. By that logic Myspace Templates may be the real keyword. Naturally, if you have the intuition that this is a problem you can work around it manually. The problem is not only will you never be perfect on every single page, but your intuition as a more advanced Internet user is often way off, especially when it comes to searching for things. Common users tend to search for what they want in a broad sense. Hell, the keyword Internet gets MILLIONS of searches. Who the fuck searches for a single common word such as Internet? Your audience is who. Whereas you tend to think more linearly with your queries because you have a higher understanding of how Ask Jeeves isn't really a butler that answers questions. You just list all the keywords you think the desired results will have. For instance, "laptop battery hp7100" instead of "batteries for a hp7100 laptop." Dynamic SEO is a plug n play way of solving that problem automatically. Here's how you do it.

Create A Dynamic SEO Module

The next site you hand code is a great opportunity to get this built and in play. You'll want to create a single module file such as dynkeywords.pl or dynkeywords.php that you can use across all your sites and easily plug into all your future pages. If you have a dedicated server you can even set up the module file to be included (or required) on a common path that all the sites on your server can access. With it you'll want to give the script its own SQL database. That single database can hold the data for every page of all your sites. You can always continue to revise the module and add more cool features, but while starting out it's best to start simple. Create a table that has a field structure similar to ID, URL, KEYWORD, COUNT. I put ID just because I like to always have some sort of primary key to auto increment. I'm a fan of large numbers, what can I say?
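To make that table structure concrete, here's a minimal sketch in Python/SQLite. The post assumes a Perl or PHP module against MySQL; the table and column names here are just my own illustration of the ID, URL, KEYWORD, COUNT layout described above:

```python
import sqlite3

# In production this would be a shared on-disk database that every site on
# the server can reach; an in-memory database keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS keyword_counts (
        id      INTEGER PRIMARY KEY AUTOINCREMENT,  -- auto-incrementing primary key
        url     TEXT NOT NULL,                      -- page the search visit landed on
        keyword TEXT NOT NULL,                      -- phrase the visitor searched for
        count   INTEGER NOT NULL DEFAULT 1,         -- times that phrase referred traffic
        UNIQUE (url, keyword)                       -- one row per page/keyword pair
    )
""")
conn.commit()
```

One table really is enough for every page of every site, since the URL column disambiguates them.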

Page Structure & Variables To Pass To Your Module

Before we get deep into the nitty gritty functions of the module we'll first explore what basic data it requires and how the site pages will pass and return that data. In most coded pages, at least on my sites, I usually have the title tag in some sort of variable. This is typically passed to the template for obvious reasons. The important thing is it's there, so we'll start with that. Let's say you have a site on home theater equipment and the subpage you're working on is on LCD televisions. Your title tag may be something like "MyTVDomain.com: LCD Televisions - LCD TVs".

Side Note/BTW sorry I realize that may bother some people how in certain cases I’ll put the period outside of the quotes. I realize it’s wrong and the punctuation must always go inside the quotes when ending a sentence. I do it that way so I don’t imply that I put punctuation inside my keywords or title tags etc etc./Side Note

You know your keywords will be similar to LCD Televisions, but you don't know whether LCD TVs would be a better keyword, ie. it could either be a higher traffic keyword or even a more feasible keyword for that subpage to rank for. You also don't know if the plurals would be better or worse for that particular subpage, so you'll have to keep that in mind while you pass the module the new title variable. So before you declare your title tag, create a quick scalar for it (hashref array). In this scalar you'll want to put in the estimated best keywords for the page: [Keyword1 -> 'LCD Television', Keyword2 -> 'LCD TV']. Then put in the plurals of all your keywords. It's important not to try to over automate this because A) you don't want your script to just tag the end of every word with "s" for grammatical reasons (skies, pieces, moose, geese) and B) you don't want your module slowing down all the pages of your site by consulting a dictionary DB on every load: [Keyword1 -> 'LCD Television', Keyword2 -> 'LCD TV', Keyword3 -> 'LCD Televisions', Keyword4 -> 'LCD TVs']. Now for you "what about this awesome way better than your solution" mutha fuckas that exist in the comment section of every blog, this is where you get your option. You didn't have to use a scalar array above; you could just have used a regular array and passed the rest of the data in their own variables, or you could have put them at the beginning of the standard array and assigned the trailing slots to the keywords, OR you could use a multidimensional array. I really don't give a shit how you manage the technical details. You just need to pass some more variables to the module's starting function, and I happen to prefer tagging them onto the scalar I already have: [Keyword1 -> 'LCD Television', Keyword2 -> 'LCD TV', Keyword3 -> 'LCD Televisions', Keyword4 -> 'LCD TVs', URL -> '$url', REFERRER -> '$referrer', Separator -> '-']. In this case the $url will be a string that holds the current url the user is on. This may vary depending on the structure of the site. For most pages you can just pull the environmental variable of the document url, or if your site has a more dynamic structure you can grab it plus the query_string. It doesn't matter; if you're still reading this long fuckin' post you probably are at the point in your coding abilities where you can easily figure this out. Same deal with the referrer. Both of these variables are very important and inside the module you should make a check for empty data. You need to know what page the pageview is being made on and you'll need to know if they came from a search engine and if so what keywords they searched for. The Separator is simply the character you want to separate the keywords by once it's outputted. In this example I put a hyphen so it'll be "Keyword 1 - Keyword 2 - Keyword 3". Once you've got this, all you have to do is include the module in your code before the template output, have the module return the $title variable and have your template output that variable in the title tag. Easy peasey, beautiful single line of code.
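The bundle of data described above, sketched as a plain dictionary. Python here is purely for illustration (the post's module would take a Perl hashref or PHP array), and every name below is my own:

```python
# Everything a page hands the module before rendering: hand-picked keywords
# (including hand-written plurals), the current URL, the referrer, and the
# separator character for the title tag.
page_data = {
    "keywords": [
        "LCD Television",   # estimated best keywords first...
        "LCD TV",
        "LCD Televisions",  # ...then their plurals, written out by hand
        "LCD TVs",          # (no blind "s"-tagging: skies, geese, moose)
    ],
    "url": "http://mytvdomain.com/lcd-televisions.html",
    "referrer": "http://www.google.com/search?q=lcd+tvs",
    "separator": " - ",
}

def default_title(data, limit=3):
    # Until search traffic arrives there are no counts, so just join the
    # first few hand-picked keywords with the separator.
    return data["separator"].join(data["keywords"][:limit])
```

The template then drops the returned string into the title tag, one line of code as described.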

Basic Module Functions

Inside the module you can do a wide assortment of things with the data and the SQL, and we'll get to a few ideas in a bit. For now just grab the data and check the referrer for a search engine using regex. I'll give you a start on this but trust it less the older this post gets:

Google:
Referrer match: ^http://www.google.[^/]+/search?.*q=.*$
Keyword extract: [?&]q= *([^& ][^&]*[^& +])[ +]*(&.*)?$

Yahoo:
Referrer match: ^http://(w*.)*search.yahoo.[^/]+/.*$
Keyword extract: [?&]p= *([^& ][^&]*[^& +])[ +]*(&.*)?$

MSN:
Referrer match: ^http://search.(msn.[^/]+|live.com)/.*$
Keyword extract: [?&]q= *([^& ][^&]*[^& +])[ +]*(&.*)?$
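The same referrer check can also be done by parsing the URL instead of regexing it, which is a bit less fragile as engines change their query strings. A hedged Python sketch, with my own engine table and function name (note Google uses `q` and Yahoo uses `p`, matching the patterns above):

```python
from urllib.parse import urlparse, parse_qs

# Which query-string parameter holds the search phrase, per engine.
# Substring matching on the host is crude but mirrors the regexes above.
ENGINE_PARAMS = {
    "google": "q",
    "yahoo": "p",
    "msn": "q",
    "live": "q",
}

def search_keyword(referrer):
    """Return the search phrase if the referrer is a known engine, else None."""
    parsed = urlparse(referrer)
    host = parsed.netloc.lower()
    for engine, param in ENGINE_PARAMS.items():
        if engine in host:
            values = parse_qs(parsed.query).get(param)
            if values:
                return values[0].strip().lower()
    return None
```

`parse_qs` already handles the `+`-to-space and percent decoding the regexes have to fudge with `[ +]*`.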

Once you've isolated the search engines and the keywords used to find the subpage, you can check to see if it exists in the database. If it doesn't exist, insert a new row with the page, the keyword, and a count of 1. Then select from the database where the page is equal to the $url, ordered by the highest count. If the count is less than a predefined delimiter (ie. 1 SE referrer) then output the $title tag with the keywords in order (may want to put a limit on it). For instance if they all have a count of 1 then output from the first result to the last with the Separator in between. Once you get your first visitor from a SE it'll rearrange itself automatically. For instance if LCD TV has a count of 3 and LCD Televisions has a count of 2 and the rest have a count of 1, you can put a limit of 3 on your results and you'll output a title tag with something like "LCD TV - LCD Televisions - LCD Television", LCD Television being simply the next result, not necessarily the best result. If you prefer to put your domain name in your title tag, like "MYTVSITE.COM: LCD TV - LCD Televisions - LCD Television", you can always create an entry in your scalar for that and have your module check for it, and if it's there put it at the beginning or end or whatever you prefer (another neat customization!).
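The count-then-reorder logic above might look like this in Python/SQLite, a sketch under my own names (the `ON CONFLICT` upsert needs SQLite 3.24+; older setups would do the select-then-insert-or-update described in the text):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE keyword_counts (url TEXT, keyword TEXT, count INTEGER, "
    "UNIQUE (url, keyword))"
)

def record_hit(url, keyword):
    # New keyword for this page: insert with a count of 1. Seen before: bump it.
    conn.execute(
        "INSERT INTO keyword_counts (url, keyword, count) VALUES (?, ?, 1) "
        "ON CONFLICT (url, keyword) DO UPDATE SET count = count + 1",
        (url, keyword),
    )
    conn.commit()

def build_title(url, fallback_keywords, separator=" - ", limit=3):
    # Highest-counted keywords first; before any SE traffic arrives, fall
    # back to the hand-picked list so the page targets all of them.
    rows = conn.execute(
        "SELECT keyword FROM keyword_counts WHERE url = ? "
        "ORDER BY count DESC LIMIT ?",
        (url, limit),
    ).fetchall()
    keywords = [r[0] for r in rows] or fallback_keywords[:limit]
    return separator.join(keywords)
```

Three hits on "LCD TV", two on "LCD Televisions" and one on "LCD Television" would yield the "LCD TV - LCD Televisions - LCD Television" title from the example, and the order keeps rearranging itself as traffic comes in.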

Becoming MR. Fancy Pants

Once you have the basics of the script down you can custom automate and SEO every aspect of your site. You can do the same technique you did with your title tag with your heading tags. As an example, you can even create priority headings *wink*. You can go as far as doing dynamic keyword insertion by putting placeholders into your text such as %keyword%, or even a long nonsense string that'll never get used in the actual text such as 557365204c534920772f205468697320546563686e6971756520546f20446f6d696e617465. With that you can create perfect keyword density. If you haven't read my super old post on manipulating page freshness factors you definitely should, because this module can automate perfect timings on content updates for each page. Once you have it built you can get as advanced and dialed in as you'd like.
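The placeholder swap itself is a one-liner; the only care needed is picking a token that can never appear in real copy (the function and token names below are my own illustration):

```python
def insert_keyword(page_text, top_keyword, token="%keyword%"):
    # Replace every placeholder with whichever keyword currently leads the
    # count for this page, so density follows real search traffic over time.
    return page_text.replace(token, top_keyword)
```

Run it over the body text (and headings) at render time, after the module has decided the page's current best keyword.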

How This Works For Your Benefit

Here's the science behind the technique. It's all about creating better odds of each of your subpages hitting those perfect keywords with the optimal traffic that page, with its current link building, can accomplish. In all honesty, done manually, your odds are slim to none and I'll explain why. A great example of these odds in play are the ranges in competitiveness and volume by niche. For instance, say you build a site around a homes for sale database. You do a bit of keyword research and figure out that "Homes For Sale In California" is an awesome keyword with tons of traffic and low competition. So you optimize all your pages for "Homes For Sale In $state". Without knowing it you may have just missed out on a big opportunity, because while "Homes For Sale In California" may be a great keyword for that subpage, "New York Homes" may be a better one for another subpage, or maybe "Homes For Sale In Texas" is too competitive and "Homes In Texas" may have less search volume but your subpage is capable of ranking for it and not the former. You just missed out on all that easy traffic like a chump. Don't feel bad, more than likely your competitors did as well.

Another large advantage this brings is in the assumption that short tail terms tend to have more search volume than long tail terms. Say you have a page with the keywords "Used Car Lots" and "Used Car". As your site gets some age and you get more links to it, that page will more likely rank for Used Car Lots sooner than Used Car. By the same token, once it's ranked for Used Car Lots for awhile and you get more and more links and authority, since Used Car is part of Used Car Lots you'll become more likely to start ranking for Used Car, and here's the important part. Initially, since you have your first ranking keyword, it will get a lot of counts for that keyword. However, once you start ranking for the even higher volume keyword, even if it is a lower rank (e.g. you rank #2 for Used Car Lot and only #9 for Used Car), then the count will start evening out. Once the better keyword outcounts the not-as-good one, your site will automatically change to be more optimized for the higher traffic one while still being optimized for the lesser. So while you may drop to #5 or so for Used Car Lot, your page will be better optimized to push up to, say, #7 for Used Car. Which will result in that subpage getting the absolute most traffic it can possibly get at any single time frame in the site's lifespan. This is a hell of a lot better than making a future guestimate on how much authority that subpage will have a year down the road and its ability to achieve rankings WHILE you're building the fucking thing; because even if you're right and call it perfectly and that page does indeed start to rank for Used Car, in the meantime you missed out on all the potential traffic Used Car Lot could have gotten you. Also keep in mind, by rankings I don't necessarily always mean the top 10. Sometimes rankings that result in traffic can even go as low as the 3rd page, and hell, if that page 3 ranking gives you more traffic than the #1 slot for another keyword, fuck that other keyword! Go for the gold at all times.

What About Prerankings?

See, this is what the delimiter is for! If your page hasn't achieved any rankings yet then it isn't getting any new entry traffic you care about. So the page should be optimized for ALL, or at least 3-6, of your keywords (whatever limit you set). This gives the subpage at least a chance at ranking for any one of the keywords while at the same time giving it the MOST keywords pushing its relevancy up. What I mean by that is, your LCD page hasn't achieved rankings yet, therefore it isn't pushing its content towards either TV or Televisions. Since it has both essentially equaled out on the page, the page is more relevant to both keywords instead of only a single dominant one. So when it links to your Plasma Television subpage it still has the specific keyword Television instead of just TV, thus upping the relevancy of your internal linking. Which brings up the final advanced tip I'll leave you with.

Use the module to create optimal internal linking. You already have the pages and the keywords; it's a very easy and short revision. Pass the page text or the navigation to your module. Have it parse for all links. If it finds a link that matches the domain of the current page (useful variable) then have it grab the top keyword count for that other page and replace the anchor text. Boom! You just got perfectly optimized internal linking that will only get better over time.
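A rough sketch of that revision, regex-based and simplified to plain `<a href="...">` links (the hardcoded lookup table stands in for the module's SELECT against the keyword-count table; all names are mine):

```python
import re

# Top-counted keyword per internal page; in the real module this comes
# from the keyword-count database, not a hardcoded dict.
top_keyword = {
    "/lcd-televisions.html": "LCD TV",
    "/plasma-televisions.html": "Plasma Television",
}

def optimize_anchors(html, domain="mytvdomain.com"):
    # Rewrite the anchor text of internal links to each target page's
    # current best keyword; leave external links untouched.
    def swap(match):
        href = match.group(1)
        if domain in href:
            path = href.split(domain, 1)[1]
        elif href.startswith("/"):
            path = href
        else:
            return match.group(0)  # external link, leave as-is
        keyword = top_keyword.get(path)
        if keyword:
            return '<a href="%s">%s</a>' % (href, keyword)
        return match.group(0)

    return re.sub(r'<a href="([^"]+)">([^<]*)</a>', swap, html)
```

Since the anchors are regenerated from the counts on every render, the internal linking re-optimizes itself as the counts shift, exactly the "gets better over time" effect described above.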

There ya go, naysayers. Now you can say you've learned an SEO technique that's both pure white hat and, no matter how simply you explain it, very much advanced.

–>