
Why OpenAI, Meta, Google, and More Rely on GPUs to Build Powerful AI Bots Like ChatGPT and Gemini (Explained Simply)


It would be an understatement to say that AI models have taken the world by storm. They’re everywhere—in your phone, the car you drive, the washing machine you use, and even the video games you play. It all started with the launch of ChatGPT by OpenAI, and since then, many other major tech companies, including Meta, Google, and Anthropic, have launched their own AI models and then bots based on them. But why are only a select few companies able to create these AI bots? Is it just research? The answer is no!

AI companies need a lot of capital, and they also need access to massive amounts of computing power to train their respective models. Now, where does the “compute” come from? Graphics processing units, or GPUs, as you know them. That’s right, the same hardware you might use to power some video games on your computer.

This is precisely why NVIDIA has seen exponential growth, at one point becoming the number one company in the world by market capitalization, even beating out Apple and Microsoft. Its ranking has shifted since then, but it remains comfortably in the top five.

So, why do AI companies need GPUs? Let’s answer this question in this explainer.


GPUs and AI: What’s the Connection?

Short answer: GPUs are simply better at the kind of computation AI needs and are more power-efficient than CPUs. And their performance scales with how much you spend on them, making them an ideal choice.
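To make that difference concrete, here is a minimal sketch (assuming Python with the PyTorch library and a machine that has a CUDA-capable NVIDIA GPU) that times the same large matrix multiplication on the CPU and on the GPU. The exact speed-up depends entirely on your hardware; the numbers below are not from the article.

```python
import time
import torch

# Size chosen so the difference is visible; adjust for your hardware.
N = 4096
a = torch.randn(N, N)
b = torch.randn(N, N)

# The same matrix multiplication, first on the CPU.
start = time.time()
a @ b
cpu_seconds = time.time() - start

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()   # copy the data to GPU memory
    torch.cuda.synchronize()            # make sure the copy has finished
    start = time.time()
    a_gpu @ b_gpu
    torch.cuda.synchronize()            # wait for the GPU kernel to complete
    gpu_seconds = time.time() - start
    print(f"CPU: {cpu_seconds:.3f}s, GPU: {gpu_seconds:.3f}s")
else:
    print(f"CPU: {cpu_seconds:.3f}s (no CUDA GPU available)")
```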

Long answer: NVIDIA, which makes GPUs for OpenAI, Meta, Oracle, Tesla, and others, says its current GPUs can do parallel processing, scale to supercomputer levels, and are backed by a broad software stack for AI.

And this is exactly why market leaders like OpenAI have trained their large language models using thousands of NVIDIA GPUs. In simple terms, these AI models are essentially neural networks, according to NVIDIA: they are built from layers of equations, with one layer’s output feeding into the next. This is where the massive computational power of GPUs comes into play. GPUs, with their thousands of cores (think of them as tiny computers), “solve” the layers of equations that make up an AI model.

This ability to process multiple layers simultaneously in real time is what makes GPUs ideal for training AI. Think of a GPU as a giant ice shaver—a means of shaving a large block of ice (an AI model) into millions of tiny crystals and turning it into something you can eat.
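For a rough idea of what those “layers of equations” look like in code, here is a hedged sketch of a tiny two-layer network in PyTorch, where each layer is just a matrix of weights. The sizes and names are purely illustrative (they do not come from any real model); the point is that moving the tensors to the GPU lets its thousands of cores perform all the multiply-adds in a layer at the same time.

```python
import torch

# Run on the GPU if one is available, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Illustrative sizes only: a "layer" is essentially a big matrix of weights.
x  = torch.randn(1, 1024, device=device)      # one input example
w1 = torch.randn(1024, 4096, device=device)   # layer 1 weights
w2 = torch.randn(4096, 1024, device=device)   # layer 2 weights

# Each layer is an equation: multiply by the weights, apply a non-linearity.
hidden = torch.relu(x @ w1)   # thousands of multiply-adds, run in parallel on a GPU
output = hidden @ w2          # the next layer consumes the previous layer's result

print(output.shape)  # torch.Size([1, 1024])
```

Training a real model repeats this kind of computation billions of times over enormous matrices, which is why it takes thousands of GPUs working together rather than one.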


AI Tech Giants Are Busy Buying Powerful GPUs

Just recently, NVIDIA provided the H200 GPU to OpenAI to develop GPT-5 and eventually create AGI, or Artificial General Intelligence. The H200 is considered to be the fastest GPU, and NVIDIA CEO Jensen Huang personally “handed over” the world’s first DGX H200 to OpenAI earlier this year, in April.

More recently, Jensen Huang and Mark Zuckerberg shared the stage for a livestream where they discussed a variety of AI topics, including how open-source AI models make AI accessible to millions of people. Zuckerberg even said that in the future, everyone will have their own AI model. Livestream aside, Meta is already a major NVIDIA customer and has a strong relationship with the company. It is said to have purchased around 350,000 NVIDIA H100 GPUs for its Meta AI and Llama ambitions, and by the end of 2024, Meta expects to have a total of 600,000 GPUs.

