Friday, January 5, 2007

Step-by-Step: How to Get BILLIONS of Pages Indexed by Google

As most SEOs know, MSN loves subdomains. You can make hundreds of keyworded subdomains and MSN will think quite highly of the pages. The same goes for Blogspot and other hosted blogs: they do very well on MSN and sometimes on Yahoo. Now Google and the new BigDaddy crawler are showing an even more idiotic preference when indexing and ranking subdomains.

Check out this site: search for eiqz2q.org. Depending on which datacentre you hit, you will see between 3.8 and 5.5 BILLION RESULTS. Even worse… the domain is EIGHTEEN DAYS OLD. That’s right: in under 3 weeks, one person has managed to get 5 billion pages from a single domain indexed in Google. And they are ranking, too. That particular domain has an Alexa ranking of under 7,000. Another domain owned by the same person, t1ps2see.com, has between 1.7 and 2.4 billion indexed pages and an Alexa ranking of under 2,000… after 4 weeks. Coincidentally, the sites also have 3 blocks of AdSense ads on each page. I wonder how much that one person is earning per day with billions and billions of pages indexed and ranking?

5 billion indexed pages
Inspired by his work, I present The Step-by-Step Guide to Getting Billions of Pages Indexed by Google:

• Register a meaningless domain consisting of numbers, letters, and secret symbols. Heck, register a hundred of them.

• Set up a server to manage all of your domains and subdomains. It will need to be beefy, as you will be serving a lot of traffic in a few days.

• Buy as many article databases as you can. Topic doesn’t matter. You might want to search and replace some i’s and 0’s with the corresponding ASCII codes to help you avoid duplicate content (a rough sketch of this trick appears after the list).

• Create or buy a common scraper script. You’ll need it to respond with different articles based on which keyword is hit, effectively serving up new content for each subdomain, and it should answer any subdomain query. Set up your server so that all subdomains resolve to your main site; there, your script sorts out what content to serve. This effectively lets you create an infinite number of subdomains with unique content. Now, the trick this guy is using involves subdomains of subdomains: you create a “topical” subdomain such as music.3hid9gw.org, and then on that subdomain you put your actual pages on additional sub-subdomains, like 2152.music.3hid9gw.org. Because each subdomain and each sub-subdomain is treated as a new site by Google, you can get past the “1 page indexed per site” delay for new domains (see the server sketch after this list). If you don’t get this part, hire someone… according to the Alexa traceback, it looks like Argentina has the right people for the job.

• Launch your blog comment spam attack. Link to some of your subdomains, which are also interlinked.

• Wait a few weeks… then sit back and enjoy your billions of indexed pages. Be sure to put 3 AdSense or YPN blocks on each page.

• Want proof? The spammer with billions of indexed pages definitely moves traffic, and now that some domains have been de-indexed, you can watch the Alexa ranking fall like a rock.
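For the duplicate-content bullet above, the post only says to swap some characters for their ASCII codes, so here is a rough sketch of what that might look like in Python, using HTML numeric character references. The helper name and the choice of characters are illustrative assumptions, not taken from the spammer’s actual script.

```python
# Rough sketch of the "swap characters for their ASCII codes" trick:
# replace a few letters with HTML numeric character references so the page
# renders identically in a browser, but the raw HTML no longer byte-matches
# the original article. Character choices here are illustrative only.

def obfuscate(article_html, targets=("i", "o")):
    """Replace each target character with its &#NNN; numeric reference."""
    for ch in targets:
        article_html = article_html.replace(ch, "&#%d;" % ord(ch))
    return article_html

if __name__ == "__main__":
    print(obfuscate("point of view"))  # -> p&#111;&#105;nt &#111;f v&#105;ew
```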
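And for the scraper-script bullet, here is a minimal sketch of the wildcard-subdomain idea in Python, assuming wildcard DNS (something like *.3hid9gw.org) already points every subdomain and sub-subdomain at one box. The article store, port, and handler names are hypothetical stand-ins; the point is only that a single script can answer for any Host header and key its content off the “topical” label.

```python
# Minimal sketch: one process answers every subdomain and sub-subdomain,
# picking content from the "topical" label in the Host header.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the "article databases" the post mentions: keyword -> HTML.
ARTICLES = {
    "music": "<h1>Music</h1><p>Scraped article about music...</p>",
    "travel": "<h1>Travel</h1><p>Scraped article about travel...</p>",
}
DEFAULT = "<h1>Welcome</h1><p>Generic filler article.</p>"

class WildcardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Host header looks like "2152.music.example.org:8080".
        host = self.headers.get("Host", "").split(":")[0]
        labels = host.split(".")
        # The "topical" keyword sits three labels from the right:
        # music in both music.example.org and 2152.music.example.org.
        keyword = labels[-3] if len(labels) >= 3 else ""
        body = ARTICLES.get(keyword, DEFAULT)
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # Every Host that resolves here gets an answer, so each of the
    # "infinite" subdomains looks like a separate, populated site.
    HTTPServer(("", 8080), WildcardHandler).serve_forever()
```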

Thanks to Nintendo at DP for pointing out how extensive this network of subdomain spam has become. If there is enough interest, I’ll work up a script that does this and post it for free so we can all enjoy the benefits of subdomain indexing.

UPDATE: This outing made the front page of Digg and del.icio.us, and is currently the top story at Techmeme and Reddit. Due to the publicity, you can watch the indexed pages of these domains shrink right before your eyes. The current site: count for eiqz2q.org is 700,000,000, down from 5,550,000,000. I assume it will be 0 in a matter of hours. While removing these domains by hand will work for now, when will the indexing algorithm itself be repaired?


Source: Monetize

Link: Google Gets Spam Punk'd

