Google's vice-president of engineering was in London this week to talk to potential recruits about just what lies behind that search page.

Obviously it would be impractical to run the algorithm over every page for every query, so Google breaks the problem down.

When a query comes into the system it is sent off to index servers, which contain an index of the Web. This index is a mapping of each word to each page that contains that word. For instance, the word 'Imperial' will point to a list of documents containing that word, and similarly for 'College'. For a search on 'Imperial College', Google does a Boolean 'AND' operation on the two words to get a list of what Hölzle calls 'word pages'.

"We also consider additional data, such as where in the page does the word occur: in the title, the footnote, is it in bold or not, and so on.

Each index server indexes only part of the Web, as the whole Web will not fit on a single machine -- certainly not the type of machines that Google uses. Google's index of the Web is distributed across many machines, and the query gets sent to many of them -- Google calls each one a shard (of the Web). Each one works on its part of the problem.
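A hypothetical sketch of that fan-out: each shard holds the index for its slice of the Web, every shard receives the query, and the partial answers are merged. The shard assignment by hashing the document ID is an assumption for illustration.

    # Hypothetical index sharding: each machine indexes one slice of the Web.
    SHARD_COUNT = 4
    shards = [{} for _ in range(SHARD_COUNT)]   # per-shard word -> doc IDs

    def shard_for(doc_id):
        return doc_id % SHARD_COUNT             # assumed: assignment by hash

    def index_page(doc_id, words):
        shard = shards[shard_for(doc_id)]
        for word in words:
            shard.setdefault(word, set()).add(doc_id)

    def search(word):
        hits = set()
        for shard in shards:                    # fan the query out to every shard
            hits |= shard.get(word, set())      # ...and merge the partial results
        return hits

    index_page(1, ["imperial", "college"])
    index_page(6, ["imperial"])
    print(search("imperial"))                   # {1, 6}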

Google computes the top 1,000 or so results, and those come back as document IDs rather than text. The next step is to use document servers, which contain a copy of the Web as crawled by Google's spiders. Again, the Web is essentially chopped up so that each machine holds one part of it. When a match is found, it is sent to the ad server, which matches the ads and produces the familiar results page.
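In the same spirit, a toy version of that second phase: document servers keyed by doc ID turn the ranked IDs into snippets. The snippet logic here is a stand-in, not Google's.

    # Toy document-server phase: doc IDs from the index servers are turned
    # into snippets from the crawled copy of the page. Purely illustrative.
    doc_store = {
        1: "Imperial College London is a science-focused university...",
        2: "Imperial units and how to convert between them...",
    }

    def snippet(doc_id, length=60):
        return doc_store[doc_id][:length]       # stand-in: a plain text prefix

    def results_page(ranked_doc_ids):
        # The real pipeline also consults the ad server at this point.
        return [snippet(d) for d in ranked_doc_ids]

    print(results_page([1, 2]))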

Google's business model works because all of this is done on cheap hardware, which allows it to run the service free of charge to users and charge only for advertising.

The hardware
"Even though it is a big problem", said Hölzle, "it is tractable, and not just technically but economically too. You can use very cheap hardware, but to do this you have to have the right software."

Google runs its systems on cheap, no-name 1U and 2U servers -- so cheap that Google refers to them as PCs. After all, each one has a standard x86 PC processor, a standard IDE hard disk, and standard PC reliability -- which means it is expected to fail once in three years.

On a PC at home, that is acceptable for many people (if only because they're used to it), but on the scale at which Google works it becomes a real issue; in a cluster of 1,000 PCs you would expect, on average, one to fail every day. "On our scale you cannot deal with this failure by hand," said Hölzle. "We wrote our software to assume that the components will fail and we can just work around it. This software is what makes it work."
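The arithmetic behind that expectation is simple enough to check; a back-of-the-envelope calculation using the figures Hölzle quoted:

    # One failure per machine every three years, across a 1,000-PC cluster.
    machines = 1000
    mtbf_days = 3 * 365                         # mean time between failures

    expected_failures_per_day = machines / mtbf_days
    print(round(expected_failures_per_day, 2))  # ~0.91 -- about one a day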

One key idea is replication. "This server that contains this shard of the Web, let's have two, or 10," said Hölzle. "This sounds expensive, but if you have a high-volume service you need that replication anyway. So you have replication and redundancy for free. If one fails you have a 10 percent reduction in service, so there are no failures as long as the load balancer works. So failure becomes a manageable event."
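A minimal sketch of how a load balancer might realise that, assuming plain round-robin over replicas of one shard; the replica names and the strategy are invented for illustration:

    # Illustrative load balancing over replicas of one shard: a dead
    # replica is skipped, costing capacity rather than availability.
    import itertools

    replicas = [f"shard0-replica{i}" for i in range(10)]
    alive = {name: True for name in replicas}
    rotation = itertools.cycle(replicas)        # assumed: plain round-robin

    def route_query(query):
        for _ in range(len(replicas)):
            replica = next(rotation)
            if alive[replica]:                  # skip failed machines
                return f"{replica} answers {query!r}"
        raise RuntimeError("all replicas down")

    alive["shard0-replica3"] = False            # one failure = 10% less capacity
    print(route_query("imperial college"))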

In reality, he said, Google probably has "50 copies of every server". Google replicates servers, sets of servers and entire data centres, added Hölzle, and has not had a complete system failure since February 2000. Back then it had a single data centre, and the main switch failed, shutting the search engine down for an hour. Today the company mirrors everything across multiple independent data centres, and the fault tolerance works across sites, "so if we lose a data centre we can continue elsewhere -- and it happens more often than you would think. Stuff happens and you have to deal with it."

A new data centre can be up and running in under three days. "Our data centre now is like an iMac," said Hölzle. "You have two cables, power and data. All you need is a truck to bring the servers in, and the whole burn-in, operating-system install and configuration process is automated."

Working around the failure of cheap hardware, said Hölzle, is fairly simple. If a connection breaks, that machine has crashed, so no more queries are sent to it. If there is no response to a query, that again signals a problem, and the machine can be cut out of the loop.
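That detection logic can be as plain as a connection probe. A hedged sketch along those lines, where the hosts, ports and timeout are placeholders:

    # Illustrative failure detection: a broken or timed-out connection
    # marks the machine dead, and it stops receiving queries.
    import socket

    def is_alive(host, port, timeout=1.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:                         # refused, unreachable or timed out
            return False

    servers = [("10.0.0.1", 8080), ("10.0.0.2", 8080)]   # placeholder addresses
    live = [s for s in servers if is_alive(*s)]          # only these get queries
    print(live)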

That is redundancy taken care of, but what about scaling? The Web grows every year, as does the number of people using it, and that means more strain on Google's servers.

 