Should You Block Google From Crawling Your Slower Pages?

Google’s head of web spam, Matt Cutts, has put out a new video discussing site speed’s impact on rankings. This is not the first time Cutts has addressed the issue, but it’s a somewhat different take than we’ve seen before, as it’s in direct response to the following user-submitted question:

You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user’s request, resulting in a slow page time. Should we not allow Googlebot to index these pages to improve our overall site speed?
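For context, the kind of blocking the question describes is usually done in robots.txt, which stops Googlebot from crawling the listed paths (strictly speaking, a meta robots noindex tag is what controls indexing, while robots.txt controls crawling). A minimal sketch of what that would look like, assuming the slow, query-heavy pages live under a hypothetical /search/ path:

    # Block Googlebot from crawling the slow, query-driven pages
    # (the /search/ path is a placeholder for illustration)
    User-agent: Googlebot
    Disallow: /search/

As Cutts explains below, though, this is generally not the trade-off he recommends.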

“I would say, in general, I would let Googlebot crawl the same pages that users see,” says Cutts. “The rule of thumb is this: only something like 1 out of 100 searches is affected by our page speed mechanism that says things that are too slow rank lower. And if it’s 1 out of 100 searches, that’s 1 out of roughly 1,000 websites. So if you really think that you might be in the 1 out of 1,000, that you’re the slowest, then maybe that’s something to consider.”
