As you’re probably well aware, Google has been overrun with spam lately. It seems some guy from Moldova created a script on his ad-laden site that caused the server to generate billions of subdomains whenever it was visited by a Googlebot. Since Google treats subdomains as regular domains (something that shouldn’t change because of this, more on that below), the pages got fully indexed. Google is currently in the process of cleaning up the wreckage.
The problem doesn’t stem from subdomains; it stems from the spambots he initially deployed to leave messages on message boards and in blog comments, creating links back to his site (and making his sites seem that much more popular to the Googlebots). Google has already admitted that link popularity is no longer a reliable way to gauge a site’s value, due to spam tactics like this, and will be weighting it less in future updates to the engine.
There must be a better way of identifying a spam site than just looking for the existence of a subdomain. Many sites (including this one, but also biggies like Wikipedia, Slashdot, pretty much every major site on the web) use and rely upon subdomains to categorize content. So these site owners are faced with a choice: change their organization methods to maintain inclusion, or drop from the search engines (could you imagine Wikipedia on page 50?).
Googlebot, listen to me! Bring this message back to your creators! For the love of sanity, don’t punish subdomains just for being subdomains! You’ve got some really smart people there, some of the smartest in the world…you can find another solution!