Google Reveals Custom-Built Processor for Its AIs
Early this year, Google DeepMind’s AI, AlphaGo, beat Lee Sedol, one of the world’s top Go players. Behind AlphaGo’s success is the TPU, a custom-built processor called the Tensor Processing Unit. According to Norm Jouppi, a hardware engineer at Google, the TPUs used in AlphaGo enabled it to analyze positions much faster and look farther ahead between moves.
The TPU is also the hardware behind the success of Google’s RankBrain and Street View. RankBrain is the AI Google uses to improve the relevance of its search results; Street View, meanwhile, uses machine learning to improve the quality and accuracy of Google Maps and navigation.
According to Jouppi, Google has been running TPUs inside its data centers for close to two years. He described the TPU as a custom-built processor designed specifically for machine learning. TPUs have proved to “deliver an order of magnitude better-optimized performance per watt for machine learning,” he said.
Jouppi added that the TPU’s capability is “roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).”
“Building TPUs into our infrastructure stack will allow us to bring the power of Google to developers across software like TensorFlow and Cloud Machine Learning with advanced acceleration capabilities,” Jouppi said.
TensorFlow is Google’s open source library for machine learning. Cloud Machine Learning, on the other hand, is Google’s service that lets developers build machine learning models easily, regardless of the type or size of their data.
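For readers unfamiliar with TensorFlow, the sketch below gives a rough sense of what building a small model looks like. It is a minimal illustration using the Keras API bundled with current TensorFlow releases, with made-up layer sizes and synthetic data rather than anything from Google’s announcement; the same code runs unchanged whether the underlying tensor operations land on a CPU, a GPU, or TPU hardware in Google’s cloud.

    import numpy as np
    import tensorflow as tf

    # Synthetic data: 1,000 samples with 10 features each, binary labels.
    # (Purely illustrative; not from the article.)
    x = np.random.rand(1000, 10).astype("float32")
    y = np.random.randint(0, 2, size=(1000,))

    # A small feed-forward model built with the Keras API.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

    # Compile and train; TensorFlow dispatches the math to whatever
    # accelerator is available at runtime.
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=5, batch_size=32)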