http://glinden.blogspot.com/2009/10/advice-from-google-on-large-distrib…
A standard Google server appears to have about 16 GB of RAM and 2 TB of disk. If we assume Google has 500k servers (which seems like a low-end estimate, given that they used 25.5k machine-years of computation in September 2009 on MapReduce jobs alone), they can hold roughly 8 petabytes of data in memory and, after 3x replication, roughly 333 petabytes on disk. For comparison, the Internet Archive, a large web crawl with history, is about 2 petabytes, and "the entire [written] works of humankind, from the beginning of recorded history, in all languages" has been estimated at 50 petabytes. So it looks like Google can easily hold an entire copy of the web in memory and all the world's written information on disk, and still have plenty of room for logs and other data sets. Certainly no shortage of storage at Google.
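A minimal sketch of the back-of-envelope arithmetic, assuming the figures above (500k servers, 16 GB of RAM and 2 TB of disk per server, 3x replication); these are the post's estimates, not published Google numbers:

```python
# Back-of-envelope fleet storage estimate. All inputs are the post's
# assumptions, not official figures.
SERVERS = 500_000
RAM_GB_PER_SERVER = 16
DISK_TB_PER_SERVER = 2
REPLICATION = 3

ram_pb = SERVERS * RAM_GB_PER_SERVER / 1_000_000   # GB -> PB
disk_raw_pb = SERVERS * DISK_TB_PER_SERVER / 1_000  # TB -> PB
disk_usable_pb = disk_raw_pb / REPLICATION          # after 3x replication

print(f"RAM across the fleet:      {ram_pb:.0f} PB")          # 8 PB
print(f"Raw disk:                  {disk_raw_pb:.0f} PB")      # 1000 PB
print(f"Usable after replication:  {disk_usable_pb:.0f} PB")   # ~333 PB
```

The 8 PB of RAM comfortably exceeds the ~2 PB Internet Archive crawl, and the ~333 PB of usable disk exceeds the 50 PB estimate for all recorded written works, which is where the "no shortage of storage" conclusion comes from.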