NVIDIA's dominance of the AI chip market
It's cat-related
NVIDIA, one of our favorite brands!
How NVIDIA went from gaming and graphics hardware to ruling AI chips allegedly had something to do with cats. It all happened circa 2010, when Bill Dally, now chief scientist at NVIDIA, was having breakfast with Andrew Ng, a former Stanford colleague who was working on a project with Google.
Working at the Google X Lab, Ng was trying to build a neural network that could learn on its own. The neural network learned to distinguish human faces, bodies, and cats after being shown ten million YouTube videos. However, to do this accurately, the system needed thousands of central processing units (CPUs). Over that breakfast, Dally bet that systems would soon be able to do the same using only a few graphics processing units (GPUs), since their massively parallel processing capabilities make them better suited to AI tasks than CPUs.
Dally then passed the idea to Bryan Catanzaro, who now heads the deep learning research lab at NVIDIA. Catanzaro matched the feat with only 12 GPUs, confirming that GPUs were faster and more efficient than CPUs at training the cat-recognition model. Catanzaro, however, emphasizes that NVIDIA did not stumble into the AI market by accident; its success came from strategic planning and execution.
NVIDIA’s competitors catching up
As businesses increasingly use AI to interpret the vast amounts of data their systems process, and governments invest in deep learning research to stay competitive, demand for faster and more capable AI chips keeps growing. Competitors are therefore nipping at NVIDIA's heels and starting to catch up with innovations of their own. Google has started to produce its own chips; Amazon's AWS Inferentia chips power Alexa's brain; Qualcomm is holding its own with the Cloud AI 100; and IBM is designing an energy-efficient AI chip.
NVIDIA's rivals argue that the company has dominated the industry only because it has done a wonderful job of hiding the complexity of a GPU through meticulous optimization and complex layers of software. Forget GPUs, they add, because one can build an entirely new AI chip architecture from scratch. Google offers its TPUs, which are application-specific integrated circuits; Cerebras has its Wafer-Scale Engine; and IBM and BrainChip provide neuromorphic chips.
To this, Catanzaro counters that many chips on the market are just hardware that speeds up AI workloads, so-called AI accelerators.
Dominating the market: a game changer
NVIDIA stays at the top of its game with annual upgrades that have bested Google's TPUs.
Moreover, NVIDIA is bent on maintaining its dominance and made a US$40 billion acquisition bid for ARM, a British chip designer that licenses its chip designs to other companies.
Competitors are concerned that if NVIDIA acquires ARM, it would restrict those licensing partnerships; Huang responds that NVIDIA would respect ARM's open model. Either way, the deal would have a massive impact on the shape of the market: NVIDIA would rule the data centre with its GPUs and gain a further edge with ARM's designs.
Over the past year, NVIDIA has been building the largest and most powerful supercomputer in the UK, called Cambridge-1, using 80 DGX A100 boxes. Designed to be relatively plug-and-play, the DGX is a complete AI computer, with memory, networking, and everything else built in.
Cambridge-1 is the only supercomputer that NVIDIA will open up to external partners, including universities and healthcare giants such as AstraZeneca, Oxford Nanopore and GSK, which can run their own deep-learning models on it.
At present, NVIDIA also operates Selene, the fifth-largest supercomputer in the world.
The future of AI’s deep-learning models
Dally admits there is a possibility that deep-learning models developed in the future may no longer run well on GPUs, but considers it improbable, since most researchers develop on GPUs.
Others disagree, arguing that GPUs may be holding deep-learning models back from their full potential. According to one study, the computing demands of machine learning are far outpacing hardware improvements and gains in model-training efficiency. This will eventually make developing such systems costly, both financially and environmentally.
To solve this, researchers may need to build more efficient models and make better use of what they already have. One idea, from Neil Thompson, a researcher at MIT's Computer Science and AI Lab, is to be "methodical about data", training only on examples relevant to the parameters at hand. Another is to distil what models learn into more lightweight equations, running only the important part of a model rather than a massive universal one.
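One common way to "run only the important part of a model" is magnitude pruning: discard the smallest weights and keep the few that carry most of the signal. The toy sketch below (an illustration of the general technique, not any specific lab's method) prunes 90% of a dense layer's weights and shows that the slimmed-down layer still approximates the full one.

```python
import numpy as np

rng = np.random.default_rng(1)

# A dense layer where only ~10% of weights are large; the rest are
# small background values, as is common in trained networks.
weights = rng.standard_normal((256, 64)) * (rng.random((256, 64)) < 0.1)
weights += rng.standard_normal((256, 64)) * 0.01
x = rng.standard_normal(256)

# Magnitude pruning: keep only the largest 10% of weights by
# absolute value, zeroing out the rest.
threshold = np.quantile(np.abs(weights), 0.90)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

full_out = x @ weights
lite_out = x @ pruned

# The pruned layer uses ~10% of the parameters yet stays close
# to the full layer's output.
rel_err = np.linalg.norm(full_out - lite_out) / np.linalg.norm(full_out)
print(f"kept weights: {np.count_nonzero(pruned)} of {weights.size}")
print(f"relative error: {rel_err:.3f}")
```

In practice the pruned network is usually fine-tuned afterwards to recover any lost accuracy, but even this naive one-shot version shows why a fraction of a model can stand in for the whole.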
AI shouldn't be limited to those who can build or afford a supercomputer. Eitan Medina, CEO at Habana Labs, argues that for AI to be affordable to everyone, price-performance must improve.
Even as these challenges are met, AI chips will continue to evolve. In the future, AI will touch every aspect of our lives as it starts to manage our fridges and coffee makers, for example. In just a few years, every appliance, gadget, piece of equipment, and even everyday process will be embedded with AI.
- NVIDIA's dominance in the AI industry started with teaching a neural network to recognize cats using only 12 GPUs.
- GPUs proved faster and more efficient than CPUs at training the cat-recognition model.
- Although NVIDIA had a head start, competitors are beginning to catch up.
- NVIDIA's rivals argue that the company has dominated the industry only because it has done a wonderful job of hiding the complexity of a GPU through meticulous optimization and complex layers of software.
- NVIDIA continues to strive to remain a shining star of the AI industry.
Jaclyn-Mae Floro, BCompSc
Contact W3IP Law on 1300 776 614 or 0451 951 528 for more information about any of our services or get in touch at email@example.com.
Disclaimer. The material in this post represents general information only and should not be taken to be legal advice.