
Google Challenges Nvidia In Artificial Intelligence Processor Market

  • 2 August 2017
  • Cas Proffitt

Artificial intelligence processors have become an increasingly competitive market over the last couple of years. Google has taken their AI processor development to a new level, moving away from graphics processing units (GPUs) to their own cloud-based Tensor Processing Unit (TPU), which supports their open source machine learning framework, TensorFlow.

Many have expressed concern that Google could use their cloud-based TPUs and TensorFlow to lock AI developers into their cloud services by leveraging the cost of re-optimizing code for a different type of processor and framework.


What is the difference between Nvidia and Google AI processors?

Nvidia is a well-known developer and manufacturer of GPUs, originally for gaming applications. GPUs have been used heavily in the AI development race because they can process large data sets much more quickly than general-purpose processors. Nvidia's most recent AI GPU is reported to be capable of 120 teraflops.


Google has been developing custom chips and other components for years to drive their analytics and AI research. Their new cloud TPU is reported to deliver 180 teraflops using four chips.
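Taking the article's reported peak figures at face value, the comparison works out as follows (a minimal sketch using only the numbers above; these are reported peak ratings, not measured benchmarks, and the two vendors may count operations differently):

```python
# Peak throughput figures as reported in the article (not benchmarks).
nvidia_gpu_tflops = 120        # Nvidia's most recent AI GPU
google_tpu_board_tflops = 180  # Google's cloud TPU board
tpu_chips_per_board = 4        # the cloud TPU uses four chips

# Per-chip throughput of the TPU board.
tpu_per_chip = google_tpu_board_tflops / tpu_chips_per_board
print(f"TPU per chip: {tpu_per_chip:.0f} teraflops")  # 45 teraflops

# The four-chip TPU board versus a single Nvidia GPU.
ratio = google_tpu_board_tflops / nvidia_gpu_tflops
print(f"TPU board vs. GPU: {ratio:.1f}x")  # 1.5x
```

So while the four-chip TPU board outpaces the GPU as a unit, each individual TPU chip delivers less raw throughput than Nvidia's single GPU on these reported numbers.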


How will proprietary hardware and programming languages impact the AI development space?

Proprietary hardware and software have a unique way of impeding progress across industries. One of the most significant costs of proprietary development is duplicated effort: multiple companies spend resources building essentially the same development platforms. Many developers are concerned that Google will use their higher-efficiency platform to lure developers into the TensorFlow framework and cloud-based TPUs and then lock them in.


What are some benefits to a commercially available AI development platform?

While Google is using a proprietary chipset, they are also making the technology available to the public. Benefits of making the technology publicly available include allowing developers to train and deploy AI models without having to front the cost of hardware and other aspects of platform development.



Artificial intelligence is becoming increasingly important in nearly every corner of industry. To keep pace with its expansion, hardware and software developers continue to create newer and more efficient ways to train and run AI models.



Google's choice to make their TPUs available to the public increases developers' ability to build and work with AI systems for a wide variety of uses. By making TPUs commercially available through their cloud architecture, Google is both opening a potential market trap and helping to push AI development forward.

About Cas Proffitt

Cas is a B2B Content Marketer and Brand Consultant who specializes in disruptive technology. She covers topics like artificial intelligence, augmented and virtual reality, blockchain, and big data, to name a few. Cas is also co-owner of an esports organization and spends much of her time teaching gamers how to make a living doing what they love while bringing positivity to the gaming community.