100’s Of Links/Hour Automated - Introduction To Black Hole SEO
I really am holding a glass of Guinness right now, so in all the authority it holds…Cheers! I’m kind of excited about this post because frankly it’s been a long time coming. For the last 7-9 months or so I’ve been hinting and hinting that there is more to Black Hat than people are willing to talk about. As “swell” as IP delivery and blog spam are, there’s an awesome subculture of Black Hats that takes the rabbit hole quite a bit deeper than you can probably imagine. This is called Black Hole SEO. By no means am I an expert on it, but over the last few years I’ve been getting in quite a bit of practice and starting to really kick some ass with it. In a nutshell, Black Hole SEO is the deeper, darker version of black hat. It’s the kind of stuff that makes those pioneering Black Hat bloggers who divulge secrets like parasite hosting and link injection techniques look like pussies. Without getting into straight-up hacking, it’s the stuff black hatters dream about pulling off, and I am strangely comfortable with kicking in some doors on the subject. However, let’s start small and simple for now. Then, if it takes well, we’ll work our way up to some shit that’ll just make you laugh, it’s so off the wall. Admit it, at one point you didn’t even think Advanced SEO existed.
In my White & Black Hat Parable post I subtly introduced this technique as well as the whole Black Hole SEO concept. It doesn’t really have a name, but basically it follows all the rules of Black Hole SEO. It targets sites on a mass scale, particularly scraper sites. It tricks them into giving you legitimate and targeted links, and it grabs its content on an authoritative scale (which will be explained in a later related post). So let’s begin our Black Hole SEO lesson by learning how to grab hundreds of links an hour in a completely automated and consenting method.
Objective
We will attempt to get black hat or scraper sites to mass-grab our generated content and link to us. It’ll target just about every RSS scraper site out there, including Blog Solution and RSSGM installs as well as many private scrapers and Splogs.
Methodology
1) First we’ll look at niche and target sources. Everyone knows the top technique for an RSS scraper is the classic Blog N’ Ping method. It’s basically where you create a scraped blog post from a search made on a popular blog aggregator like Google Blog Search or Yahoo Blog Search. Then they ping popular blog update services to get the post indexed by the engines. For a solid list of these, check out PingOMatic.com. Something to chew on: how many of you actually go to Weblogs.com to look for new interesting blog posts? Haha, yeah, that’s what I thought. 90% of the posts there are pinged from spam RSS scraper blogs. On top of that, there are hundreds going up an hour. Kinda funny, but a great place to find targets for our link injections nonetheless.
2) We’ll take Weblogs.com as an example. We know that at least 90% of those updates will be from RSS scrapers that will eventually update and grab more RSS content based upon their specified keywords. We know that the posts they make already contain the keywords they are looking for, otherwise they wouldn’t have scraped them in the first place. We also have a good idea of where they are getting their RSS content. So all we’ve got to do is find what they want, where they are getting it from, change it up to benefit us, and give it back.
3) Write a simple script to scrape all the post titles within the td class="blogname" cells located between the <!-- START - WEBLOGS PING ROLLER --> comments in the HTML. Once you’ve got a list of all the titles, store it in a database and keep doing it indefinitely. Check for duplicates and continuously remove them.
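The scraping-and-dedupe step above can be sketched roughly like this in Python. The td class and the PING ROLLER comment markers come from the post itself; everything else (the SQLite schema, the function names) is my own assumption about how you might wire it up:

```python
# Rough sketch of step 3: pull the <td class="blogname"> titles out of the
# Weblogs.com HTML and keep a deduplicated list in SQLite.
import re
import sqlite3

TITLE_RE = re.compile(r'<td class="blogname">\s*(.*?)\s*</td>', re.I | re.S)

def extract_titles(html):
    """Return the titles found between the PING ROLLER comments."""
    start = html.find("<!-- START - WEBLOGS PING ROLLER -->")
    end = html.find("<!-- END - WEBLOGS PING ROLLER -->")
    section = html[start:end] if start != -1 and end != -1 else html
    return TITLE_RE.findall(section)

def store_titles(db, titles):
    """Insert titles; the UNIQUE constraint silently drops duplicates."""
    db.execute("CREATE TABLE IF NOT EXISTS titles (title TEXT UNIQUE)")
    db.executemany("INSERT OR IGNORE INTO titles VALUES (?)",
                   [(t,) for t in titles])
    db.commit()
```

You would run this in a loop against the freshly fetched page; the INSERT OR IGNORE handles the "check for duplicates" part for you.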
4) Once you’ve got all the titles steadily coming in, write a small script on your site that outputs the titles into a rolling XML feed. I know I’m going to get questions about what a “rolling XML feed” is, so I’ll just go ahead and answer them. It’s nothing more than an XML feed that updates in real time. You just keep adding posts to it as they come in and removing the previous ones. If the delay is too heavy you can either make the feed larger (up to about 100 posts is usually fine) or create multiple XML feeds to accommodate the inevitably tremendous volume. I personally like the multiple feed idea.
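A rolling feed like the one described might look like the sketch below: keep only the newest batch of titles and regenerate the RSS on every pass. MY_URL, the channel text, and MAX_ITEMS are all placeholders, not anything from the original post:

```python
# Minimal sketch of the "rolling XML feed" from step 4: cap the feed at
# MAX_ITEMS and rebuild it each time new titles come in.
from xml.sax.saxutils import escape

MAX_ITEMS = 100
MY_URL = "http://example.com/"   # the link we inject instead of the original

def rolling_feed(titles, sales_copy):
    """Build an RSS 2.0 feed from the newest titles."""
    items = []
    for title in titles[-MAX_ITEMS:]:           # drop everything older
        items.append(
            "<item><title>%s</title><link>%s</link>"
            "<description>%s</description></item>"
            % (escape(title), MY_URL, escape(sales_copy)))
    return ('<?xml version="1.0"?><rss version="2.0"><channel>'
            "<title>Fresh Posts</title><link>%s</link>%s</channel></rss>"
            % (MY_URL, "".join(items)))
```

The slice on the title list is what makes it "rolling": old posts simply fall off the end as new ones arrive.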
5) Give each post within the feed the same title as you scraped from Weblogs. Then change the URL output field to your website address. Not the original! Haha, that would do no good, obviously. Then create a nice little sales post for your site. Don’t forget to include some HTML links inside your post content, just in case their software forgets to remove them.
6) Ping a bunch of popular RSS blog search sites. The top 3 you should go for are:
Google Blog Search
Yahoo News Search
Daypop RSS Search
This will republish your changed-up content so the RSS scrapers and all the sites you scraped the titles from will grab and republish your content once again. However, this time with your link. This won’t have any effect on legitimate sites or services, so there really are no worries. Fair warning: be sure to make the link you want to inject into all these Splogs and scraped sites a quickly changed and updated variable, because this will gain you links VERY quickly. Let’s just say I wasn’t exaggerating in the title. A good idea would be to put the link in the database, and every time the XML publishing script loops through, have it query it from the database. That way you can change it on the fly as it continuously runs.
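The pinging in step 6 is normally done with the standard weblogUpdates.ping XML-RPC call. Here's a minimal sketch; the endpoint URLs are from memory for these 2007-era services and should be treated as placeholders:

```python
# Sketch of step 6: send the standard weblogUpdates.ping XML-RPC call to
# each blog search service. Endpoints are placeholders and may be dead.
import xmlrpc.client

PING_ENDPOINTS = [
    "http://blogsearch.google.com/ping/RPC2",
    "http://api.my.yahoo.com/RPC2",
    "http://www.daypop.com/RPC2",   # placeholder, may not exist anymore
]

def build_ping(blog_name, feed_url):
    """Serialize a weblogUpdates.ping request body."""
    return xmlrpc.client.dumps((blog_name, feed_url),
                               methodname="weblogUpdates.ping")

def send_pings(blog_name, feed_url):
    for endpoint in PING_ENDPOINTS:
        try:
            xmlrpc.client.ServerProxy(endpoint).weblogUpdates.ping(
                blog_name, feed_url)
        except Exception:
            pass    # dead or rate-limiting endpoints are expected; move on
```

Per Eli's advice later in the comments, you'd call send_pings once per batch of posts, not once per post.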
As you’ve probably started to realize, this technique doesn’t just stop at gaining links quickly; it’s also a VERY powerful affiliate marketing tool. I started playing around with this technique before last June and it still works amazingly. The switch to direct affiliate marketing is easy. Instead of putting in your URL, grab related affiliate offers, and once you’ve got a big enough list start matching for related keywords before you republish the XML feed. If a match is made, put in the affiliate link instead of your link, and instead of the bullshit post content put in a quick prewritten sales post for that particular offer. The black hat sites will work hard to drive the traffic to the post and rank for the terms, and you’ll be the one to benefit.
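The affiliate switch described above is just keyword matching before the feed is republished. A hypothetical sketch, with entirely made-up offers, links, and keywords:

```python
# Hypothetical sketch of the affiliate switch: each offer carries its own
# trigger keywords and prewritten sales copy; every scraped title is matched
# against the list before it goes back out in the feed.
OFFERS = [   # links, keywords, and copy below are all invented examples
    {"link": "http://aff.example/diet", "keywords": ["diet", "weight loss"],
     "copy": "Prewritten diet pitch..."},
    {"link": "http://aff.example/credit", "keywords": ["credit", "loan"],
     "copy": "Prewritten credit pitch..."},
]
DEFAULT_LINK = "http://mysite.example/"

def pick_offer(title):
    """Return (link, copy) for the first offer whose keyword hits the title,
    falling back to the plain link-building URL when nothing matches."""
    lowered = title.lower()
    for offer in OFFERS:
        if any(kw in lowered for kw in offer["keywords"]):
            return offer["link"], offer["copy"]
    return DEFAULT_LINK, "Generic sales post..."
```

The fallback branch is the "use the title to get an inbound link" case Eli describes in the comments below.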
Each individual site may not give you much, but when you scale it to several thousand sites a day it starts really adding up quickly. By quickly I mean watch out. By no means is that a joke. It is quick. There are more RSS-scraped pages and sites going up every day than any of us could possibly monetize, no matter how fast you think your servers are.
This made my frickin’ head explode. I can see I am going to be writing some new code in the wee hours of the morning tomorrow…
Agreed, this is going to cost me at least a day’s work!!!
That’s how you get banned now!
Really, you’re right!
One day worth of coding that can bring hundreds of hours of backlinks.
can you give us the code?
This sounds great
hi eli,
how do you do your nr3 script?
Is it enough to change the positions of the words, or should I kill some letters out of the title,
like “this title is best” kill the “t”
and results in “his ile is bes” ?
Nah, keep the exact same titles. You’re not stealing anyone’s content. You are more than welcome to have the same post title. The only people this technique affects are RSS scraper sites, so you can’t go too incredibly wrong with it.
Uff, for a noob this is too complicated. Can someone write a script which does all those steps automatically, along with a visual interface guiding you through it?
I didn’t understand half of what you posted *lol*
I’d like someone to write a script for me that does this, then install it on my website for me.
Then I’d like that person to come over and wipe my ass for me: two wipes in a clockwise direction.
Gimme a break guys! Start learning how to do this simple stuff yourself and stop depending on the charity of others!
“Then I’d like that person to come over and wipe my ass for me: two wipes in a clockwise direction.”
LOL, Lol, lol….
Automatically? Eli just laid out a completely automated tool.
Being automatic, wtf would a GUI or “visual interface” even do? Guide you through what? It’s automatic!
btw Eli: is this the way I should promote the subdomains you wrote me about via email?
Nice post.
This is a really stupid question but what language does the script need to be in?
You can do this with PHP, Perl, Ruby, Python, and more. If you are a forever-noob like me, I would vote for PHP, because php.net is an incredible resource for gluing this sort of thing together. I think the way PHP handles things like dealing with MySQL is easy to understand, and I think it will give you more possibilities for extending these ideas by incorporating them into the wide variety of PHP website scripts out there.
PHP5 would be an excellent choice!
Guys,
No one is going to hand over this script to you. This is valuable stuff. If you don’t know how to code it, pay someone to do it for you. But you can’t expect everything to be handed to you on a silver platter.
I actually have been doing this. The results are just as you described. I didn’t know it had anything black hat in it, though.
Haha yeah, I wouldn’t call it very blackhat. After all, you are basically targeting spammers, so who cares?
Eli…when you did this with affiliate links, what were the profits like?
Georgi do you do any seo consultancy work or coding.
Anyone want to sell me a working script?
Hey Eli,
Couple of clarifications. What are you feeding the scrapers for content? I know you are giving them the weblogs titles, but where is the actual content they are scraping off us coming from ? Are you saying one sales copy for all posts, just with different titles?
Also, how many of these scrapers actually link back to the original article.. I don’t see that too often.
Does anyone know the regular expression to scrape from weblogs.com? I don’t get it; the multiple lines are killing my tries…
Just str_replace the newlines out before doing the regexp. That’s what I always do.
http://www.ilovejackdaniels.com/regular_expressions_cheat_sheet.png
Look at pattern modifiers. /i and /m are your friend. I like to use /ims at the end of each one.
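For anyone doing this in Python rather than Perl, the equivalent of those /i and /m-style modifiers are the re.I and re.S flags. A quick illustration (the sample HTML is made up):

```python
# The same idea as the /ims modifiers above, in Python: re.S lets "." cross
# newlines, so a tag whose contents span multiple lines matches without
# stripping the newlines first.
import re

html = '<td class="blogname">\n  My Split\n  Title\n</td>'

# Without re.S this pattern fails, because "." stops at each newline.
match = re.search(r'<td class="blogname">(.*?)</td>', html, re.I | re.S)
title = " ".join(match.group(1).split())   # collapse the whitespace
```

This is also why the str_replace-the-newlines trick mentioned above works: it just fakes what re.S (or Perl's /s) does for you.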
Very nice link man.
He’s back!
If i understand correctly, the idea is to create a RSS feed in xml for a non-existing blog, but that will contain instead keyword-targeted text with links to your affiliates links?
Then ping everything to get your feed scraped and used on blackhat sites, thus having your content published by blackhat people?
Writing a dynamic RSS feed is a piece of cake, if you need to get started you can look at the tutorial:
http://www.icemelon.com/tutorials.php?id=3&/PHP/Generate%20RSS%20Feeds/
Then we go over a technique called link baiting. Hey guys! The script for this is at my blog!!
Very nice tutorial Eli.. Maybe you should just hint at stuff from now on because I am sure this is going to get quite a rush in the next few days. Actually though, I don’t know how many people will go through with the program.
I think that’s the case with a lot of what Eli posts… very cool ideas that LOTS of people get all hot and bothered about… I suspect that very few readers actually get to the stage of coding and launching many of these ideas though for one simple fact… some effort is required.
That’s great news though, because the few of us that are actually trying and expanding on the ideas Eli is giving us will have less competition
Thanks again Eli for another awesome post!
Holy Cow Eli - my head just about exploded when I read this post. I love the idea - had wondered how this worked when you mentioned it in your last post. I’m definitely going to have to get someone to design something like this for me (I’m a lover not a coder ) Anyone on here who intends to make a copy of this holler at me with a quote for a price, I’ll make it worth your while
One meeeeeelion dollars!
SOLD!
Nah - not really - not got pockets THAT deep
I’ll let you guys know when I have my scripts written and working.
Now write me a script!
Tyler, where is the script on your site?
I couldn’t find it.
Thanks.
It was never there.. It was a joke.
By the way everyone.. From my calculation there are about 30 blog posts/second.. This is going to destroy my server lol..
I’m going to have to agree with most everyone on here.
If you didn’t entirely understand the post then don’t attempt it. There will be more posts in the future with lots of fun stuff you can try out. Just let this one go until you’re ready for it.
For the people with the regex question: it depends on what language you’re using, but you will have to do a multiline match as well as accept multiple matches (usually m/ and /g), and put those matches into either an array or a scalar.
Guiness on Cinco De Mayo… “BRILLIANT!”… muchas gracias for another beauty, E
Regex isn’t the only way. If you can’t get your head around regex then you can use PHP (VERY n00b friendly) with some str_replace and substr calls, along with some while loops to grab any content you want. Regex is a faster solution, but it’s not the only solution… and not always better for performance.
Don’t use regex to parse HTML/XML; it’s way too hard and breaks all the time.
Use Python; BeautifulSoup is amazingly easy and just works. Now with that said, wow, this is such a cute fun project it can be done in a simple shell script using the usual suspects: grep/sed/curl/lynx/etc.
To get your targets, why not try
grep '.xml"' shortChanges.xml | \
grep -i "\(xbox\|game\|wii\|psp\)" | \
sed 's; url=".*" ; url="http://my1337.com/rss.xml" ;' | \
sort | \
uniq
As far as the Weblogs example goes, you can use their changes log. It’s about 2MB for every 5 minutes.
http://rpc.weblogs.com/shortChanges.xml
I shouldn’t have to say this, but Right click, Save Target As (Save Link As in FF).
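Since shortChanges.xml is just a flat list of weblog elements with name/url attributes, a proper XML parser is a sturdier alternative to the grep/sed pipeline above. A sketch, with an example keyword list:

```python
# Sketch of the shortChanges.xml approach: parse the flat list of
# <weblog name="..." url="..."/> elements with ElementTree and keep only
# the entries matching our niche keywords (keyword list is just an example).
import xml.etree.ElementTree as ET

KEYWORDS = ("xbox", "game", "wii", "psp")

def filter_weblogs(xml_text):
    """Yield (name, url) for entries whose name hits one of our keywords."""
    root = ET.fromstring(xml_text)
    for weblog in root.iter("weblog"):
        name = weblog.get("name", "")
        if any(kw in name.lower() for kw in KEYWORDS):
            yield name, weblog.get("url")
```

Unlike the regex route, this won't break if attribute order or whitespace in the file changes.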
awsome eli, thanks for this
Thanks for sharing this method, I’m excited to try it out. One question, for # 6) “Ping a bunch of popular RSS blog search sites.” How often do you ping, once an hour?
Ping once per group of posts. So if you’ve got an XML feed with 100 posts, make one ping to each of the services. Then repopulate the XML feed with another 100 posts and re-ping again.
Do you use any specific software to do this? (the rss/ping stuff) or is it custom code?
Come on Eli, you always tend to post this complex techy stuff which I mess up. I get excited at the idea of getting more traffic and more money and finally end up getting nothing because I don’t do it right. Can we have a non-techy post for all us non-techy people who want to get traffic?
all your base are belong to us
Sorry buddy I don’t know what to tell you.
I post the simple ones every once in a while, but they quickly get eaten up just because everyone is capable of doing them so quickly. Then everyone complains about not getting the info in time and the idea becoming quickly saturated. It’s a bummer stigma, I realize. What would you like to hear about? Any specific topics you would particularly like to read about? You know I’m always open.
Before every post I usually take a moment to think about what I would like to read about. That usually becomes my post topic. Then I get to read further about it in the comments and learn even more. I know that sounds unfair but it’s sorta my balancing factor.
On that note, this idea is actually very simple; it’s just that I use a lot of jargon and don’t bother explaining any of it. It’s not that I’m unsympathetic to the people who aren’t familiar with the jargon; it’s just that this is, after all, an “advanced SEO” venue. So I can either explain every single thing in every single post (my posts are already quite lengthy, in case you didn’t notice) or I can just cut to the meat of it.
Thanks Eli, that was fun! Much easier than it looked. We’ll wait and see what happens next; going to have to figure out how to automate the pings. Right now I have it set up to randomly (my personal favorite) pull titles and tie them to a domain. May have to cache it if it becomes a burden on the server.
that was the most confusing thing i’ve ever read.
Any particular step that’s causing this confusion? Perhaps I can help explain it further.
Hi everybody. Hi Eli.
I’ve just grabbed and analyzed a bunch of titles of recently updated blogs. I got them from
http://blogsearch.google.com/changes.xml?last=60
Below is what I got:
BLOG_TITLES_START
weston
Database for Research Grants and Contracts
justin
http://denshi.hitchart.com/u.r/denshi/RQ2
My Wheels
weston
Real estate exam maryland
Jason Bartholme's SEO Blog
х_+ф║
That is only the first-level scraping. You will need to further scrape those RSS feeds for the actual post titles and contents. We scrape the RSS aggregator only for the list of RSS feeds. Then we scrape those feeds for the actual post titles and contents that we want. Re-package them (retain the post titles and post contents, replace the link portion) as your own feed. Lastly, ping your feed back to the RSS aggregator.
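The second-level scrape described in the comment above can be sketched like this: given one of the feeds found at the aggregator, pull each item's title and description so they can be re-packaged with your own link. Assumes plain RSS 2.0:

```python
# Sketch of the second-level scrape: parse one of the feeds found at the
# aggregator and return each item's title and description for re-packaging.
import xml.etree.ElementTree as ET

def scrape_feed(rss_text):
    """Return [(title, description)] for every item in an RSS 2.0 feed."""
    root = ET.fromstring(rss_text)
    posts = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        desc = item.findtext("description", default="")
        posts.append((title, desc))
    return posts
```

Atom feeds use different element names (entry/title/summary), so a real implementation would need a second branch for those.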
Hi Eli,
What format are your feeds in? RSS, RSS2, Atom?
It’s a great idea, but the methodology went over my head. Can you provide us the script and more detailed examples?
Please please please
Eli:
I found a much easier way of doing this which I won’t post here. An e-mail is heading your way in a few minutes.
Jason
LOL, that’s not even fair. You can’t go around saying you have an easy way to do this, and then not tell us. It’s cruel I say, CRUEL!
Does the weblog url we ping with have to be where the “post” is stored? Or can we just make the post url be http://www.sitewewanttopromote.com/?
Hi Eli,
I would like to read about promoting your site through social networking sites, and how exactly to go about it. You have covered this in various posts as tidbits, but I would like a single, big, easy-to-understand one. In a non-technical way, of course.
I have begun working on a script for this, I will sell copies when I have finished.
John,
I am emailing you from your post at Blue Hat SEO.
You mention that when you are finished building your code that you would be willing to sell a copy.
I’m interested in being put on a List — when you email me if you use “Blue Hat SEO” somewhere in the title — I will be able to find it and respond quickly.
Warmest Regards,
Jeff — [email protected]
Hi John
I’m interested in the script too.
It would be good if you could put me on your list and email your PayPal details.
Kind regards
Mark
Eli:
I just sent you that e-mail
Jason
Great post! I’ve yet to come across another blogger who’s giving away tricks and tips like these
Anyway, I was wondering: in step 3 you’re talking about grabbing and storing all the post titles from weblogs.com. Maybe I’m missing something here, but doesn’t weblogs.com only list the general blog titles?
I was wondering the same thing. I don’t see how just re-feeding the blog titles does any good.
Those are not the titles you want. You need to further scrape the RSS feeds to get the actual post titles and contents.
Thanks for such a valuable resource Eli. This blog just gets better and better.
One question: maybe I’m missing something here, but couldn’t a “poor man’s” way to do this be to just download http://rpc.weblogs.com/shortChanges.xml, find and replace everything between the url="" attributes, and ping all the above-mentioned blog resources?
Perhaps I’m missing where the actual blog post comes from.
thx.
There will be a follow up post to this whole thing by the end of the day.
Well, Weblogs.com is now worthless. Everyone and their brother will be using those titles. Time to figure out how to get the info from the other ping services.
My big thing is I’m trying to figure out how to filter out crazy titles that are not in English.
If you want to get really fancy and even targeted, write a way to categorize the post titles and then target links / title that are relevant.
By the way, I’ll just come right out and say it: in that post I made a blatant hint at the next Black Hole SEO technique when I mentioned authoritative content. So if you want a head start on everyone, you might want to start thinking about how someone like me would go about getting unique content that has been proven to be authoritative in the search engines.
I think I know what you mean, and I borrowed some code from a “website generator” to help with this process. Let’s just say the posts read a little “differently” now.
Of course, I used less authoritative content…
Alright, here is the moment that you have all been waiting for
I have released my automated version of this to the public on my blog. Here is the link:
#EDIT BY ELI: LINK REMOVED UNTIL I SEE A COPY.
Jason
Hi Eli,
is it a good idea to combine this method with your network idea?
like one script for collection all titles and for each network site a ping script with their own rss feed
Not especially. That wouldn’t do a whole ton of good. However, it would be an awesome idea to combine this method with my Link Laundering Sites technique. Also, a little birdy told me this works great as an ultimate solution for “Power Indexing.”
Wow, Wonder if i can max out my two clustered dual cpu iis boxes….. Gotta love dedicated hardware.
Great Post - keep it coming! Your post really help me to think outside of the box, and to focus on ways to up my game.
Black hole SEO
Following up on yet another silly phrase made up by Mr Bluehat, I’ll tell you how to do black hole SEO in another way. You know what else people scrape a lot? Search engine results.
So how would you go about making people scrape SERPS that inclu…
That reminds me, there’s a search engine API that I was scraping for extra content on an experimental site and the funny thing was after 1.5 months my scraped pages started showing up at the top of the results in that engine and my own pages were showing up in the results I was scraping.
Hey Eli,
Nice post! Some questions thinking about Google and contents being related.
As always if I am wrong or not getting it right, let me know please.
1 - Wouldn’t you filter those titles in order to make them related to the content you will use in your XML feed? (Relevancy)
It would be great to parse with the niche you want to target in mind (this implies we would be using multiple resources to scrape titles and get some matches related to our content). This way, you get links from scraper-made websites targeting specific niches.
2 - “The switch to direct affiliate marketing is easy. Instead of putting in your URL, grab related affiliate offers and once you got a big enough list start matching for related keywords before
you republish the XML feed.” (Quality)
We talked a bit about this on your last post where you introduced the idea. It seems we will get tons of incoming links fast, but these are splogs as you said (poor quality). Seems ideal for heavy link spamming processes and short-term affiliate revenue. Not so much for websites with long-term aspirations.
In your last post, you mentioned that websites, blogs, etc. could scrape contents from white hats defending their position against BH (including links to their websites) or unwillingly insert links to banned domains, thus decreasing their value as a source of incoming links for the BH webmaster. (As a matter of fact, this was the only part of your post I had some doubts about, because since white hatters are getting links from there also, aren’t they harming themselves at the same time? Why not just insert those banned domains and let BH get horrible links that harm their rankings solely?)
As you can see, my doubt always revolves around the benefits of these fast link building schemes when it comes to SEO projects. Link velocity will be great, but what about results in the long run? What will prevail? What do you think, based on your experience?
Keep it up!:)
Nick
Heya Nick,
Great comment/questions as always. I’m glad you post them as comments so everyone can read them.
First, your first question. I build links for volume, and I build links for relevancy. Just by personal policy I never mix the two, simply because whenever you try to do both, one, the other, or both will naturally suffer. So I always try to do each to its maximum capacity in a separate manner. It hasn’t failed me yet.
However in this case, if you were to do the affiliate offer rather than going for the straight link building, then definitely. A reader already commented on this on the follow-up post. He solved the targeted traffic problem by gathering tons of affiliate offers, making a solid list of keywords for each one, then attempting to match each possible title with an affiliate offer. I would stretch that one step further. I would put a priority on the affiliate offers. So if a possible match could be made, then I would insert the affiliate link instead. If no match could be made, then I would use the title to get an inbound link. Kind of like a Link Laundering technique on steroids.
I think I just accidentally answered your second question. Indeed we did talk about it. With the same intent as before, I would either use these links for traffic or use them for SEO purposes. Mixing them is fine, but do it tactfully and, like you said, targeted. These are not the highest quality links in the world, but many will have solid link authority because the owner may drive massive amounts of deep links to his pages through link bombing practices. More often than not, these link bombing techniques will involve gathering relevant links. So even though his page may not be relevant to your site, it can still pass good authority, thus boosting your rankings.
So don’t focus on worrying about building links too quickly. It seems logical that search engines would think about this and consider it a bad thing, but it’s simply impossible to make applicable without drastic consequences. Take for instance the presidential race going on right now. Each candidate has a brand new website. Instantly, overnight, they all got absolutely massive amounts of links, most completely irrelevant and from random blogs and sites that have nothing to do with their subject. Can you find a single site that doesn’t rank for its terms? Even the celebrity candidates toppled everyone who’s been in long standing already. Gaining links too quickly is never as much of a problem as gaining links too slowly. Although if you are the type that spends 7 hours a day hitting refresh on the results page for your terms, then sanity may be a factor.
I think it’s a damn shame that people here have yet to truly understand and realize the power of my old Synergy Links post. If you completely ignore the entire technique itself and strictly comprehend that it is entirely possible to change the relevancy value of a group of inbound links into a high quality and relevant link, the SEO world is your oyster.
Here’s how you can grab the weblog names from the shortChanges file on the command line.
wget http://rpc.weblogs.com/shortChanges.xml
cat shortChanges.xml | grep "weblog name" | perl -ne '/"(.*?)"/;print "$1\n";'
and the result …
Dear God Part Two
My blog 710
trip, etc.
Demetrius
Dave
Thanks Dave, that’s perfect.
What exactly is that cat going to do for you?
/me spent waaaay too many hours on #sed,#awk and #grep
;)
mmh how would you filter out titles that look like
跳呀跳的,我è¦�?è·³é€²ä½ å¿ƒè£¡ or something?
I think titles like that would not be of any use.
Use utf8_decode and then filter out titles that contain 3 or more ? characters.
This eliminates 90% of the titles like that.
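A rough Python analogue of that utf8_decode trick: characters that don't fit in Latin-1 turn into "?", and titles with three or more of them get dropped as probable mojibake or non-English. The threshold is the one suggested above; it's a crude heuristic (a title that legitimately contains question marks would also count against the limit):

```python
# Rough analogue of the PHP utf8_decode filter: encode to Latin-1 with
# replacement and drop titles that accumulate 3 or more "?" characters.
def looks_garbled(title, limit=3):
    decoded = title.encode("latin-1", "replace")
    return decoded.count(b"?") >= limit

def filter_titles(titles):
    return [t for t in titles if not looks_garbled(t)]
```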
thanks, works nicely!
I use this to filter out those characters:
$val = iconv("UTF-8", "UTF-8//IGNORE", $val);
it does the trick.
Introduction To Black Hole SEO
You’ve got to love a guy that reveals the deep secrets of SEO in step-by-step detail.
…
Hi Eli,
First of all - thanks for an awesome blog. I’ve learned a lot from you! If you ever come to Copenhagen, I’ll be the first to take you out for a beer and some hot Scandinavian chicks :p
Anyway, I’ve got a question:
- How often do you ping the blog search engines with your feed?
Anyone should of course feel free to answer.
Hi,
I’ve been scraping and pinging for two days now.
For some reason I can’t ping Yahoo - I get a 403 when I use the regular ping, and I can’t auto-post a form to them due to their beacon cookie shit. Anyone got a solution?
So I’m left with DayPop and Google Blogsearch… however, I’m not seeing my links in their index yet. Am I just impatient or doing something wrong?
Thanks
Anyone have a working script or a demo now?
hey Eli,
Wonder in the SEO World….
I am able to populate the database every 10 minutes with the title and URL from
http://rpc.weblogs.com/shortChanges.xml
Now my doubt is: do I need to go to each and every URL of the corresponding title to retrieve all the post titles?
Hi Decipher,
I was wondering the same thing. It seems, however, that if a blog post has a title (or if the blog template shows the post’s title as the page’s title, or if the services the owner pings somehow manage to get the title) then these XML feeds will contain the title of each post. If not, then they’ll contain the title of the blog.
I noticed that it’s got quite a few private myspace blogs (definitely not splogs, or they wouldn’t be private) and quite a few posts that only contain the blog title rather than the post’s title.
Well, guess it’s up to each of us to optimize our scripts accordingly….
Good luck.
Eli,
help on a few things. Forget about the technical part.
1. Why will Google/Yahoo blog search index this XML feed that you’re proposing to create?
2. Presuming they do, how long before Google/Yahoo blog search becomes polluted with other junk content?
For example, first page of google blog search yields at least two splogs focused on adsense earnings
http://www.newsmob.com/step2.php?id=2761632
http://mortgage-rates.jeremymorgan.com/2007/05/27/uncategorized/katrina-still-bad-for-business-in-pass-christian/
hey Eli,
Again good job with another very intriguing idea. I’ve got a question if you don’t mind clarifying. I understand that you scrape the titles and place them in the db. When you republish the XML feed, what do you use for content?
Hi very cool post. However i wondered about one thing:
“So to keep in the up and up the Black Hatters always be sure to include proper credit to the original source of the post by linking to the original post as indicated in the RSS feed they grabbed. This backlink slows down the amount of complaints they have to deal with and makes their operation legitimate enough to continue stress free.”
I get that this whole concept is predicated on the black hatter/RSSGM linking back to the source. But do they really do this? I was of the opinion that they wouldn’t care if you get credit or not, and you’re unlikely to find out if they do steal your content, so why would they?
Thanks again
[…] 100’s Of Links/Hour Automated - Introduction To Black Hole SEO […]
Hey, great post.
One question. What would you recommend putting in the main title, link, and description tags of the channel element of the RSS feed? Should it point to the site being promoted, as well?
Also, instead of submitting to those three linked-to blog search sites, would it work better just to ping pingomatic?
It’s me again with a couple more questions.
Firstly, how often would you recommend pinging those blogsearch sites?
Secondly, with the multiple feed idea, what would be the sense of that? wouldn’t it just result in a lot of stale data as the old feeds are no longer updated and only new ones are generated?
I appreciate any replies to the above.
Brandon, I don’t think that pingomatic has RSS content. It just sends your feed to sites who do.
Hi,
Yeah, I realize that. I meant getting the content from weblogs.com and pinging the suggested places like Google’s blog search, etc., in addition to pingomatic? Would that work? They do have an RSS URL option.
I’ve implemented this and it’s been running for a few days now. Every 5 minutes it downloads the latest titles and puts 100 of them into a database, which is read by an RSS file. That script also pings about four different places. I don’t really know how it is working yet.
Hi Eli,
Great post and blog thanks.
I have one question about this method.
It relies on the idea that just pinging your xml feed will get you ranked in Google's and Yahoo's blog search so the scrapers scrape your links.
Surely it takes more than a ping to get ranked?
Thanks
Leon
Great post, I will definitely try it. I just have one question. The rolling RSS file that is produced - does it just contain the titles and links that have been saved or does it also contain post content?
Basically, are you just producing the same file that you parsed, with your links injected?
What’s the legal status of user comments (like this one)? Are they copyrighted as well? If so, who retains the copyright?
Hmm, I don't know if that would work; I'm sure they remove html tags in their scraping.
Well, it's 2 months on now. I'm sure most of you who did it are now listed in feeds. Any comments on how well it worked or didn't? I'm curious.
Comments would be welcome. I've run it for a few days now, and Google etc. are picking up the posts, but I won't know if any splogs are picking them up for quite a while.
Blackhole, sounds almost like physics. Hmmm
[…] 100’s Of Links/Hour Automated - Introduction To Black Hole SEO […]
Man, this is the best Seo blog i’ve ever seen
Well I’ve got my scripts set up and running… just a few quick, general questions:
1) The scraper sites re-scrape blog posts if posted with the same title?
2) How long does it take on average for the aggregators (after pinging) to index your “posts”?
3) The content for each of my posts is the same, is that a problem?
4) Eli–how in the world did you think of such an intricate strategy? Love the site.
-Drew
this is a great idea… i love it.
i want to join to Andrew questions.
I also want to show you a little something that i built.
it uses this system, you can use it with your own site..
at the moment it's not sending the pings, only giving you the xml file (it looks static, but it's dynamic and takes random titles every time), so you have to send the pings yourself. I'm working on that right now.
the only trick is… well, i take 5% of the links
if that’s okay with you, feel free to use it
http://exe.netzach.biz/sys/
btw, sorry about my english, i know it sucks.
good day, Nadav
UPDATE: It sends it to Ping-O-Matic, Google Blog Search and Daypop RSS Search, and I’m working on Yahoo.
Please give it a try and tell what you think
Nadav
Nadav
It seems like you generate posts based on the same 5 articles over and over. Try a few sample links and you’ll see what I mean when you open the xml file.
numba1, sorry, I didn't understand you. It has a lot of different titles, much more than 5.
Are you talking about the description? If so, yeah, I need to put in some more varied descriptions… I'm too lazy.
And another thing… I don't think the description is so important…
I still don't get why you need to scrape weblogs. Why not just generate a random title with relevant keywords in the 'niche' of your choice?
This site is great, been lurking for a while and decided to give this one a go.
Regarding the “rolling XML feed”. So I have written some code to automate this, but due to the volume I am creating a new feed file for every 100 entries. My initial reaction was to create the new file, ping the services, and delete the file (to save disk space). Does this feed file need to exist for a period of time after the ping for the services to process it, or does it get processed when the ping is submitted?
Thanks for all the great info - keep it coming!
Eli,
Holy crap man, this post at first blew my mind! But being a perl freakazoid, I decided to tackle this one, to see what I could do with it. So yesterday, I whipped up some code, tested it out, and finally implemented it last night. Like I said, holy crap! The results were pretty immediate, in a place I would never have expected.
I am looking over my stats for today, and the first thing that is popping out at me so far is the search terms that resulted in hits! My site is all music related, so I was struggling to develop that portion, and up to now, it was only music terms. But today, I am getting hits from search terms like “Ron Paul”, “My Braces” and “download partition magic”. Crazy! The only thing that changed was to implement your idea!
On a side note, when I create the feed, I do a rough filter so I at least get into the ballpark of relevancy.
Thanks a million times over man!
Hi, Jim. You said the results were pretty immediate. I tested my feed out on two of my sites. The second being 4 years old.
SE bots came and grabbed my feed only a few times no matter how often I pinged.
I checked my feed on the rss validators, it’s ok. I also provide different content every time I output the feed.
What’s wrong? How can I make the bots come more often and grab more content?
I just wrote the script and started doing as per your post. It is really awesome info you posted.
Surely, I'm doing what you are doing.
I just started the whole process and some important questions come up.
I’m reviewing my server logs after I pinged blogsearch.google.com and seeing an interesting picture.
The bot came a few times, grabbed my rss feed and suddenly stopped coming for more no matter how many times I pinged google.
I pinged a dozen of times over the hour only.
Second, do I have to change the description of my sales copy for every item in my xml feed?
Is there any danger of duplicate content in the feed when the same description is repeated over and over again? Or do I have to use some kind of markov chains for every description?
Does my site have to be a blog to provide rss feeds to aggregators, or can it be an ordinary site?
I feel your blog is good, and the method is good too, but I want to know how to do this for a web site that's not a blog?
email me host:hush.com user:mr_man
if you want a free version of the script that does this.
Email you where ?
If you can’t figure out my email i really think you should be concerning yourself with another part of internet marketing.
What about a free script? or a paying script?
It is very risky.
Google hates it.
Andrew, sorry, I haven’t stopped by here in while, and missed your question entirely.
I am seeing the same things you're seeing to some degree. I ping out every 10 minutes, so in theory, I should be seeing the bots hit me 144 times a day or so. Yet looking at my stats, I see, for example, 197 hits on feeds.xml. Not sure why this is, or how to fix it yet, but I am researching.
It might be related to where and how my pinging script works… it was a real hack, and just today, I rewrote it using some code I picked up out on the net in preparation for selling this stuff. In addition to doing a far better job of pinging, I am also pinging a hell of a lot more services. Previously, my pinger was hitting two sites only.
I'm not too worried, as I am still refining this, but even as is, I am still adding inbound links every day, and actually see a lot of hits from this method in my metrics. This is also in addition to the ever-growing weird search terms that bring people to my site.
Excellent! I can’t believe how much goes on under the surface of what I think of BH stuff. I can’t wait ’til part 2!
would anyone be interested in setting this up for $$. Let me know. justfurryfriendscom user: jp
I'm with Ozzmo, someone holler at me. I want to pay money to get someone to start implementing these tactics so I can get a better grasp of this type of approach. Let's get 'er done.
yahoo user:cjayhey
I just wrote an implementation of this a few days ago… not sure it’s exactly the way Eli prescribes but it works damn well… hundreds of pages found in Google for a distinct blurb I inserted into all my rss entries.
The thing that really gets me (other than how fast it works) is that the splogs really do link to me, and of all the ones I've looked at, only about 5% use nofollow… crazy.
Thanks Eli!
If you want to buy a fully working script contact me tarzanby[ at ]tut.by
Andres, I will mail you now; I really need this one :)
email me with the subject “xml rss reparser” in the email. I get lots of spam, might lose some requests.
Hi Eli,
I'm now energised to learn php and dbase after reading your site. Now the typical thing WHs and BHs will ask…
Do you use this method to build backlinks for proper whitehat sites or your throw away domains?
I’m also interested in hearing the answer to this.
This definately looks interesting… I am an ASP programmer, but I may be interested in paying someone to do this for me in PHP… email me your price PickeringtonChris (at) Yahoo (dot) com
I really don't know much about programming but I am extremely interested in this discussion of backlinks. If anyone has a good working script, please email me details, cost and where to pay. I have paypal. My email is karl[at]shentel.net Thanks.
I love your site! Thanks for the great information. I am seriously so new to this, so I have one general question. If I want to learn programming and eventually get to a point where I can read this post and implement what is being said, what is the best place to start? Any particular books, sites, etc.?
And I will pay someone for the script mentioned. Please email me at zenterprises19 at yahoo.com with details. I can use paypal or another method if you prefer.
thanks!
This is the most “i finally found some cool trick”-feeling site I came across in a while.
I certainly do want such a script too and am willing to pay.
Email me at host:gmx.net user:natadd
I’m willing to pay for this script also.
Email me at sunsolutions02 [at] yahoo.com
signup for a free account on squirt. look in the vault. theres one for sale. Also theres a windows app you can buy from rudedogg.
Hello, nice script and it’s look promising i am interested and willing to pay for it, email me at: info [at] seo-proz.com
thank you in advance.
Why would anyone want to sell it when they can use this for their own sites to increase traffic, which increases $$$?
Times like this makes me wish i knew how to code. Have to get a friend to do it.
Let me see if I’ve got this straight then:
1. Read http://rpc.weblogs.com/shortChanges.xml
2. Extract all the URLs and read each one, looking for any RSS feeds on the page.
3. Read each of those RSS feeds and scrape the &lt;title&gt;s.
4. Build your own feed with those titles and ping it to google blog search etc.
Anything else I need to know?
Step 3 had the html stripped. It should say:
3. Read each of those RSS feeds and scrape the titles.
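For anyone coding along, those corrected steps boil down to "scrape titles, rebuild the feed with your own links, ping". A minimal Python sketch of the feed-building step (hedged: example.com, the channel fields and the sample titles are placeholders for illustration, not Eli's actual script):

```python
import xml.sax.saxutils as saxutils

SITE = "http://example.com/"  # placeholder for the site being promoted

def build_feed(titles, site=SITE):
    """Minimal RSS 2.0 feed: scraped titles, but every <link>
    points at *your* site instead of the original post."""
    items = []
    for title in titles:
        safe = saxutils.escape(title)  # avoids "invalid token" validator errors
        items.append(
            "<item><title>%s</title><link>%s</link>"
            "<description>%s - read more at %s</description></item>"
            % (safe, site, safe, site)
        )
    return (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>My Feed</title><link>%s</link>"
        "<description>Latest posts</description>%s</channel></rss>"
        % (site, "".join(items))
    )

feed = build_feed(["Katrina & Recovery", "Mortgage Rates Update"])
```

Serving that string as your rolling feed file and pinging the blog search engines is what puts it in front of the scrapers.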
Thanks for this post!
It was easy to read and comprehend, and I hope to implement it very soon and enjoy the results!
Thanks Eli!
Hi.
I do not understand why I always get an empty response on the google ping changes xml at http://blogsearch.google.com/changes.xml
If I do a “HEAD” to get the headers on the same url I get “502 Bad Gateway”. Do I need to issue a POST request somehow or what ?
I don’t get it. Because I’m outside of U.S. or what?
Kindly
//Marcus
Google blogsearch just doesn't like you.
Did anyone have any success implementing this? What about the results? Is it worth implementing it?
Guys, isn't Google smart enough to know that gaining 100-1000 links per day or hour is seen as spam and will derank you faster than a gerbil crawling through Richard Gere's asscrack?
Sorry for bumping this, but your site will go through the black hole if you get links too fast.
I remember reading a bit about Google’s search patent in which they explain the factors that determine ranking, and speed of gaining backlinks is one of them.
This doesn’t mean gaining backlinks quickly will have a negative effect - the moment it does, as Eli pointed out in another post, the rules of the game will change because you could poison your competitors by rapidly gaining crappy backlinks to their sites.
Clearly rapid backlink growth will not give you the same advantage as slow and gradual growth, but you’d still be better ranked than sites that have only a few backlinks.
hi,
Like obviously many others, I still don't understand where the content is coming from. Do I have to scrape it too and save it on my site (stealing content), or do I just link to it and put the links to mysitetopromotedotcom into the title?
my email: hammer(at)web(dot)de
I would be thankful for any explanation of this issue.
Also I'm interested in buying a working script in classic asp or php…
thanks for this great blog. i love it…
peter
Holy Cow Eli - my head just about exploded when I read this post. I love the idea - had wondered how this worked when you mentioned it in your last post. I’m definitely going to have to get someone to design something like this for me
Making scraper sites is easier than ever. Thanks a lot for this amazing post!
I am going to run my feed from a mysql db and pull out random results on every request.
I have started working on the script and will put it on sale once I'm finished. So the script is going to give me backlinks and a few bucks.
By the way, out of the box idea Eli,
Hats off!!!!
Can you share the code? Where can I see it?
Excellent!
I just found this script and will give this a shot.
Going through the script to understand what it does.
A++++
G.
This is quite a bit out of my league and I don't think I'd ever try something like this, but the post reads well and was entertaining.
Do you recommend doing something like this with brand new sites?
that sounds fantastic. But also very difficult..
I gave this a shot so I thought I’d share my results. Not the same as those of others who have tried it, I warn…
1) Getting the blog titles: You don't need to go to the blogs to fetch the post titles. This is very time consuming. I tried it and you won't get very many titles this way. You can get the titles of the blogs directly off of the home page at http://www.weblogs.com/. There are 10 (if I recall correctly) and they are all inside <td class="blogName"> tags. They refresh about every 5 seconds so this will not get you as many titles as you see on the roller. By examining the javascript I found a page at http://weblogs.com/pingservice?action=weblogsping which has a lot more titles and is updated about once a second. You can scrape 35 to 40 titles a second here without a second thought since the javascript in the home page is doing exactly that - fetching this page once per second (or thereabouts).
2) Affiliate products: I chose to do the affiliate version of the plan. I created a ClickBank database with 6000 or so products. I would run through the blog titles looking for a match with the affiliate product. Getting a good match requires very sophisticated techniques which I don't have. Sometimes the match would be completely irrelevant but sometimes it was good.
3) RSS files: I maintained 24 rss files. I updated one per hour so each file remained unchanged for a whole day. Once per hour I would go through step 2 until I found 100 titles. I filtered out obscene stuff. You may wish to do the same. I would set the title field to the one I scraped from weblogs.com, the description field to the ClickBank product description. The link would be my cloaked CB hot link.
4) Results: I ran this for 6 days then stopped. I omitted 16 January as it was a partial day. These are the dates and unique hosts for the day period:
Date Unique Hosts
17 95
18 49
19 25
20 20
21 31
As you can see this is not server-busting traffic. The feed crawlers account for about 12 to 15 of the hosts, which means that I got anywhere from 5 to 80 visits a day. As these results do not square with the experiences of others, I would appreciate it if any errors could be spotted.
Software used: Linux, perl, mySQL.
Thanks,
Peter
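Peter's step 2, matching a scraped title to an affiliate product, is the fuzzy part. A crude keyword-overlap matcher works as a hedged stand-in for whatever he actually used (his code isn't shown in the thread, and the product names below are made up):

```python
# Crude title-to-product matching by keyword overlap -- a stand-in for
# Peter's step 2; real matching would need smarter text similarity.
def match_product(title, products):
    """Return the product sharing the most words with the scraped
    blog title, or None if there is no overlap at all."""
    title_words = set(title.lower().split())
    best, best_score = None, 0
    for product in products:
        score = len(title_words & set(product.lower().split()))
        if score > best_score:
            best, best_score = product, score
    return best

# Hypothetical product names, purely for illustration
products = ["Mortgage Refinance Guide", "Dog Training Secrets"]
match = match_product("refinance your mortgage today", products)
```

As Peter says, a naive matcher like this will sometimes pick something completely irrelevant, which is one plausible reason his conversion traffic stayed low.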
Another Excellent post.
The main issue is that google is cracking down heavily on this
Of course google is cracking down on this stuff; they always crack down on black hat/hole. That's why you pump as much traffic/links as possible before they shut down your site/loophole.
Quantity ftw when it comes to black techniques.
I think there will be another Google update soon because of this.
Thanks! This is perfect to start with. Anyone know how long this will work?
This method will only work for 37 days, 4 hours, 16 minutes and 33 seconds from the time you posted this.
Better hurry.
Eli,
I am a newcomer to your Bluehatseo blog and I’m blown away! My knowledge of seo is sketchy to begin with, and what I know of “blackhat” doesn’t really go any further than xss, blogfarms, and cloaking. I really appreciate you–and everyone who comments–for taking the conversation as far as you have away from the mainstream.
By the way, in one of the other posts or comments, you mentioned a coder you trust. I'll find that post again, eventually, but any chance you or one of the other readers could remind me of the name? You identified the person as tobcn or something like that. Thanks!
I take my (black) hat off to the guys out there who can make sense of this and make it work for them.
No doubt most of you are less than half my age to add insult to injury
Use it with an affiliate program or free hosted blogs and take no care if google shuts you down.
Can you send me the code of the script to use ?
Thanks in advance.
Mike
This can be done without a db; just store the titles in an array. Go through the scrape a few times until you end up with 100 or so titles (after removing duplicates and foreign titles), then create a static xml file based on the time or date, then ping your favourite search engine with the file path.
I have my script wait 10 seconds in between scrapes and do it 3 times; it grabs about 30 titles each time it runs. So I end up with a static rss feed, 30 posts long. Not bad for about 50 lines of php and no DB!
cheers Eli.
p.s heres some regex for the weblogs titles…
'/" class="pingLink">\s(.*)/m'
just remove the rubbish around the title with a couple of str_replace’s
or email me for the weblogs.com scrape and ping script, [email protected]
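The array-based approach above translates almost line for line into other languages. A hedged Python sketch of the scrape-and-dedupe step (the class="pingLink" markup is taken from the commenter's regex and may not match weblogs.com's current HTML):

```python
import re

def extract_titles(html):
    """Pull post titles out of pingLink anchors, dropping duplicates
    and non-ASCII ("foreign") titles -- no database needed."""
    titles = re.findall(r'class="pingLink">\s*([^<]+)', html)
    seen, unique = set(), []
    for t in titles:
        t = t.strip()
        if t and t not in seen and all(ord(c) < 128 for c in t):
            seen.add(t)
            unique.append(t)
    return unique

# Tiny made-up sample of the markup the regex targets
sample = ('<a href="#" class="pingLink"> My First Post</a>'
          '<a href="#" class="pingLink"> My First Post</a>'
          '<a href="#" class="pingLink"> Café Notes</a>')
titles = extract_titles(sample)
```

Run a handful of passes like this, accumulate ~100 titles, write them into a static feed file, and ping.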
This method no longer works. I’ve been able to ping Google, Technorati, Ping-O-Matic and IceRocket. I’ve had 0 luck getting more links, in fact I even lost a link. The bad thing about coming up with great ideas, such as this one, is that once everybody starts doing it, it winds up getting “patched”. I find it very hard to drive traffic to my sites, and I’ve been trying many methods with no luck. Legit SEO is VERY time consuming, while using a more “automated” approach seems ideal. I realize that automating SEO is frowned upon because everybody would be doing it. However, there’s GOT to be a way to bring traffic to a site, and make a few bucks off of google ads. Even sites with “real” content, not generated content, are not coming up in the first couple pages of the search engines. So, if somebody can be a visionary for a SEO strategy that will work, PLEASE contact me. I’m an idea man, but this time I’m out of ideas.
Matt
sinack @ mail DOT ru
I would love to generate that many links that fast.
I've only recently found this blog and find it hard to believe just how advanced people like Eli are.
Awesome!
Fantastic, not the sort of thing I expect to see you blogging about but interesting none the less. Do you have any test cases you can show that get rankings?
You are naughty man with your black hole seo. Man, I think I have to test this, because I need a lot of links pointing to my website with follow.
This method works well with a little tweak
Anybody have any luck with this?
Hi. And may we know what the 'tweak' is?
Hi, does this really work? What tweak do I need to use?
Hi, I'm trying to figure out what this tool is for, but can't see any info that tells me all I want to know… is someone using this? Please tell me how it works.
Regards
this is amazing, really blew my mind!
i hope it still works as i’m gonna have a go at implementing this soon.
great site Eli!
This sounds great
Why don't you make a tool to do this automatically?
Just tell me how much I should save!
Fantastic idea Eli and those who are able to implement this. Just shows how inadequate a noobie like me is
Sounds good in theory, but not worth the time if you're doing the research yourself.
I believe G will continue its crackdown on anything seen as remotely BH.
Why don't you make software to do this automatically?
good idea, up your tips
Great, thanks for this. The post is very old but it is really fantastic and very helpful.
Fantastic idea Eli, let me kick my ass first and get me going to implement this. Thanks so much.
Anyone actually try this?
I have done as described and it quickly, I mean QUICKLY, ate up all my bandwidth. I had the RSS XML feeds on a server that was being trashed by all the bots coming to check out my updates. I can not comment on increasing links, but I definitely saw a spike in traffic to the pages (on a different server) I linked to in the XML, which therefore increased my adsense earnings. So I would like to keep this going.
Any feedback regarding how to create and ping the XML without making a house payment to pay for the bandwidth?
Just get on a hosting plan with better if not unlimited bandwidth.
I have only just started this strategy, so still have to find out the effect. I have pinged around 500 new xml feeds already. Reread the article and I found I forgot to add html links to the descriptions. Will do that on my next batch!
Wow you cant find this kinda stuff any where this whole blog has got me jumping up and down!!
But you didn't introduce us completely to black hole seo!
Is this an obsolete method?
Google made sure this doesn’t work anymore..
Wish it still worked.
Can anyone confirm that Google is still allowing this? It seems a very viable serp solution.
Great blog with great ideas. I wish I was a programmer. But alas.
The dumb thing is that once these techniques are published they don’t work anymore after a given time, due to all the followers trying it out
Interesting concept… one I am going to try out before the big G stops it in its tracks.
A very interesting read… I guess some of the holes will have been plugged by now; however, I am sure there are plenty of points raised that will help. Thanks.
So you’re beating the sploggers and scrapers at their own game, eh?
Clever.
Great article, explained very well.
Explained very well but still hard to take in, just shows how clever some of the people out there are.
automation is key to black/grey hat SEO.
I don’t believe this works anymore; but it is amazing how clever people can be. I do agree that Eli is causing our minds to bend with his SEO and IM posts. I just wish he had more of them.
Hey Eli,
Love the technique, followed it enough to do it on my own. I got as far as creating a php script that gets the titles of the posts on weblogs.com, made the rss feed to go with it, got the RSS pinger ready; the only thing is my XML feed rarely validates because so many of the titles contain 'illegal characters.' I've tried using functions like php's htmlentities but no matter what I do I get this message when I try to validate the rss:
XML parsing error: :12:75: not well-formed (invalid token)
Thus my feed will rarely get spread around because it doesn’t validate, right?
Any suggestions?
just like Dave (see above) it seems that I have the same problems with my rss-feeds:
XML parsing error: :12:75: not well-formed (invalid token). Any suggestions? Regards, Annemarie
RSS Feeds giving me trouble too -
XML parsing error: :12:75: not well-formed (invalid token)
Free dinner for whoever fixes this!
Here is the php code I use, not 100% but better than nothing:
// $doc is a DOMDocument already loaded with weblogs.com's changes.xml
$urls = $doc->getElementsByTagName( "weblog" );
foreach ( $urls as $url )
{
    $EntryName = $url->getAttribute( 'name' );
    $link = $url->getAttribute( 'url' );
    // test every character for ASCII values higher than 127 -- these
    // trigger the "not well-formed (invalid token)" validator error
    $validtext = TRUE;
    for ( $i = 0; $i < strlen( $EntryName ); $i++ ) {
        if ( ord( $EntryName[$i] ) > 127 ) {
            $validtext = FALSE;
            break;
        }
    }
    if ( $validtext )
    {
        // escape the XML special characters instead of stripping
        // individual ones like ’ and » by hand
        writeLink( htmlspecialchars( $EntryName, ENT_QUOTES ), $link );
    }
}
Thanks for posting up the sample code PonyRyd
Np
Call me stupid but I don’t understand how this benefits you if the scrapers strip your links. I know I would!
Because some will strip them but most of them won't. So if you are doing this on a big scale, it will still work out for you. Plus you can include a big
This Content Belongs To MYsite.com
at the end of each post, so at least if you don't get the link back you will still get publicity and probably good hits.
This is the first time I've come to know what Black Hole SEO is; maybe it will be good food for increasing my knowledge. Still, I am very confused about how it differs from Black Hat SEO. Is there any specific meaning to the word "Hole" in this?
So, I will finish my php-learning sessions and then come back to this. Thx.
PR 10 - here we come!!!
This is a really great idea. I know I am not ready to pull it off just yet, but it does give me something to contemplate for when the time is right.
If someone could install this on my server (I am too dumb to figure it out), I have the script and cron job and all that good stuff to do this on auto pilot.
LOL. ReGeX.
I found this script at esrun’s blog but I am too thick to get it going right. Can’t get the pinger to work and it is a bitch to install. Definitely one for the pros.
I wish someone would take some pity on us less advanced tech guys and really dumb down the whole process of installing and executing the script.
Anyone want to sell me a working script?
OMG, but I don't know how to use it.
Nice info! Gonna have to try this. Can anyone share some recent results? Is it still working?
I wonder if anybody but Eli is making this work. Dave above asked a good question: what happens if they strip the html out? Then how are you gonna get any links?
Thanks a lot. Very useful info.
Very good stuff - blackhole SEO …love the name!
I’ve never heard of blackhole SEO…where can I find more info on this?
This makes for some heavy reading. I’ve read it a couple of times now and I think I’m starting to grasp it. I’ll spend quite a bit of time on it over the weekend. Thanks for the comprehensive information though.
Nice article, but I didn’t understand something.
After you create your own rss feed that links to your affiliate pages for example, why would the rss scrapers keep the link to your site?
I thought that these people are only interested in the CONTENT of your RSS, so once they copy your content, why should they link to the scraped content source?
Thanks
Idan
Give each post within the feed the same title as you scraped from Weblogs. Then change the URL output field to your website address. Not the original! Haha that would do no good obviously. Then create a nice little sales post for your site. Don’t forget to include some html links inside your post content just in case their software forgets to remove it.
I read it all and I still do not understand any of it.
Nice post..
Google likes RSS, backlinks, PageRank buttons etc.
That's the essence of a scraper site.
Thanks for this great post ,I will try it soon.
Guys,please don’t spam.thanks.
I've been doing it. It totally works, man.
Nice idea that may work well with affiliate offers but I don’t think it would be a good idea with adsense as you might get banned due to the fact that it is scraped content.
A good thing would be to try stuff like infolinks, kontera or adbrite; they pay a little less than adsense, but what the heck, if you are churning out 100's and 1000's of pages in one day your returns will be equally more.
Once implemented, how can you tell if your feed is being scraped?
@seoeoeo
I think that you would be able to tell once you start getting the backlinks and/or traffic from the sites scraping your feed.
@eli
Once you have this setup do you ping the appropriate servers from your server? In your ‘Blog Ping Hack’ post you said that you avoid pinging from your server at all costs, so would you handle this job the same way as the Blog Ping Hack? Would you recommend combining the two techniques?
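On the pinging question: the ping itself is just a one-method XML-RPC call following the public weblogUpdates convention, from whichever machine you choose. A minimal Python sketch (the Ping-O-Matic endpoint and the extendedPing variant are assumptions for illustration, not Eli's setup; swap in whichever services you target):

```python
import xmlrpc.client

def ping(feed_name, site_url, feed_url,
         endpoint="http://rpc.pingomatic.com/"):
    """Send a weblogUpdates.extendedPing; the extended form includes
    the feed URL so aggregators can fetch the RSS directly."""
    server = xmlrpc.client.ServerProxy(endpoint)
    return server.weblogUpdates.extendedPing(feed_name, site_url, feed_url)

# The request body can be inspected without any network I/O:
payload = xmlrpc.client.dumps(
    ("My Feed", "http://example.com/", "http://example.com/feed.xml"),
    "weblogUpdates.extendedPing",
)
```

Because it's a plain HTTP POST, nothing ties the ping to the server hosting the feed, which is what makes the off-server pinging Eli describes in the Blog Ping Hack possible.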
Black hat is bad for ranking.
I’m sucked into the black hole. This information is overwhelming!
Thanks again, will put this guide to use right away!
Good that it’s not working now. Hail for white hat!
It works perfectly fine if you do it the right way as shown by Eli.
it does work when used correctly for seo
Automation is a great way to work. Good for you.
Ok, the email in the install file is no longer working. Aww man!! It keeps sending me a mailer-daemon.
Nice way to get many backlinks; this is a really working automation tool.
Hi Eli, I finally got what is actually written in the post.
I made a desktop application using the steps, will be testing it out from this weekend and let all know what are the results. Thanks for the amazing post again.
If any one wants help in what and how you should be doing, I will be glad to help. contact me( megachamp(at)megachamp(dot)com).
Thanks
Again thanks from West Coast Vinyl; this is what we were looking for: an alternative way to do link building, but faster.
Nice posting!
great article.
i learn many things from this article.
Thanks
Thanks a lot for sharing your ideas with us….
I get many things from this article, the best information and good ideas… keep writing.
For getting authentic information about Black Hole SEO, this blog has proved to be the best tool.
I noticed that it’s got quite a few private myspace blogs (definitely not splogs, or they wouldn’t be private) and quite a few posts that only contain the blog title rather than the post’s title.
Although I haven't tried every method, a few parts of this article really work, guys.
rss xploiter does this and does it great!
tell us more
let us know more
Great idea. Was worth the try.
Thanks for sharing your idea.
Awesome tips
Thanks for a great article
The methodologies shared were very clear and precise. I've learned a lot from them. Thank you for sharing a very informative post. Can't wait for your next blog.
The information you provided was very useful. Because of your help, thank you.
Nice to read about something interesting. I greet all
Thanks for the contribution Eli. Great and informative as always.
Good stuff..thanks for the info!
I am looking for social bookmarking sites and sudden I found your blue hat seo the best in my searches. Thanks
These feeds are actually your content representatives, which are easy to access, manage and publish on other blogs, websites and content feeds with the ease of a small RSS feed setup…
I’ve been meaning to mention this site for a while now, but never did have the chance to get to it…
Great new concept!
subversive legend! great post man… insane stuff.
No one is going to hand over this script to you. This is valuable stuff. If you don’t know how to code it, pay someone to do it for you. But you can’t expect everything to be handed to you on a silver platter.
True, everybody should be doing this.
First of all - thanks for an awesome blog. I’ve learned a lot from you! If you ever come to Copenhagen, I’ll be the first to take you out for a beer and some hot Scandinavian chicks :p
Wow, I like this post, omg.
One day’s worth of coding can bring hundreds of hours of backlinks.
Agreed, this is going to cost me at least a day’s work!!!
This is the 4th Blue Hat SEO guide I’ve read here; all are simply awesome, Eli. We need you to write more. Last week I submitted one of my blogs to about 1000 directories using an online tool. That is my new backlink-building method.
This made my frickin’ head explode. I can see I am going to be writing some new code in the wee hours of the morning tomorrow…
Thanks! This is perfect to start with.
This method works well!
Great new subject!
Did anyone have any success implementing this? What about the results? Is it worth implementing it?
whoa… This one is way over my head
thanks for your posting and sharing with us….
I just added this weblog to my feed reader, great stuff. Can’t get enough!
This post sounds a little like Greek to me, but I’m learning.
lol
Not sure why I did not get this info before.
I think I am going to try it for now. If it gives me good results, I might change my whole strategy around it. Thanks for sharing.
Looks like a totally new idea. However, it needs to be verified.
can anyone tell me whether it is working or not?
I’ve been meaning to start a video blog for some time now
keep it up
thanx
This is awesome, although not new anymore.
It’s really interesting idea.
I will second that
Cool idea that may work well with affiliate offers but I don’t think it would be a good idea with adsense
Holy Cow Eli - my head just about exploded when I read this post. I love the idea - had wondered how this worked when you mentioned it in your last post. I’m definitely going to have to get someone to design something like this for me (I’m a lover not a coder ) Anyone on here who intends to make a copy of this holler at me with a quote for a price, I’ll make it worth your while
me too xD
nice
Lol don’t wanna know about it
Thanks For sharing such a useful information, Great Post.
OKKKKKKKKKKKKKKKKKKKKK
YEEEEEEEEEESSSSSSSSSS
Black Hole SEO sounds quite interesting.
Btw great post Eli
its quite interesting
good Post Eli…
keep it up Eli
Man! That is wicked, but I guess you at least need a good VPS to do this stuff. It seems like a really promising technique, though.
Thanks for this article, very interesting.
How can I forget about Black Hole SEO? I learned this a few months ago, and I would say I’m quite a veteran at it.
black hole seo has been around for very long. this is nothing new!
Think of white hat as Christian, black hat as atheist.
If you are an seo ,you might be feeling a tad conflicted about now………
Thx for the info.
Does this approach still work? I’m pretty sure Google may be on to it.
nice topic
thank you man
Nice post. This post explains it very well.
thaaaaaaaaaaanks
nice intro to black hole SEO
good post but can anyone give me scripts;):)
Excellent advice!
visit MyBlogGuest, thanks for this wonderful tip
Once you got all the titles steadily coming in write a small script on your site that outputs the titles into a rolling XML feed.
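The rolling-feed idea in the comment above can be sketched in a few lines of Python. This is a hypothetical sketch, not the post's actual script: the channel title, site URL, link format, and 50-item window are all assumptions.

```python
from collections import deque
from xml.sax.saxutils import escape

# Assumed window size: keep only the 50 most recent titles.
MAX_ITEMS = 50

def render_feed(titles, site="http://example.com"):
    """Render a minimal RSS 2.0 feed from a rolling window of titles."""
    # deque with maxlen gives the rolling behavior: old titles fall off.
    items = deque(titles, maxlen=MAX_ITEMS)
    xml = ['<?xml version="1.0"?>',
           '<rss version="2.0"><channel>',
           '<title>Latest Titles</title>',  # placeholder channel title
           f'<link>{site}</link>']
    for t in items:
        # escape() keeps &, <, > from breaking the XML.
        xml.append(f'<item><title>{escape(t)}</title>'
                   f'<link>{site}/?q={escape(t)}</link></item>')
    xml.append('</channel></rss>')
    return '\n'.join(xml)
```

In practice the titles would be appended to this window as they arrive and the feed re-rendered on each request, so scrapers polling it always see the newest batch.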
Impressive! Next thing on my to do list
eeek
Really nice article, keep going!
Helpful info, thnx