Hacker News Prevents Search Engines from Spidering the Site
Here is the site’s robots.txt. The reasons for the move are unclear at the moment. It may have been prompted by the site’s recent slowness, as an attempt to reduce server load.
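For context, a robots.txt that blocks all well-behaved crawlers from an entire site looks something like the following. This is an illustrative sketch of the standard syntax, not necessarily the exact file Hacker News serves:

```
# Illustrative example: disallow all compliant crawlers from the whole site
User-agent: *
Disallow: /
```

The `User-agent: *` line applies the rule to every crawler, and `Disallow: /` forbids fetching any path. Note that robots.txt is purely advisory; it relies on crawlers voluntarily honoring it.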
Not all Hacker News users think this is a clever move. Quite a few have complained that they can no longer use search engines to find older articles submitted to the site.
Hacker News doesn’t really rely on Google and other search engines for traffic. It’s a flourishing community that shows no signs of wanting to expand and become the next Digg or anything similar. Will the users’ protests change anything? It remains to be seen.