Meet ‘Groq,’ the AI Chip That Leaves Elon Musk’s Grok in the Dust

Groq, an AI chip company, wants everyone to forget about Elon Musk’s snarky chatbot with nearly the same name, Grok. Lightning-fast demos from Groq went viral this weekend, making current versions of ChatGPT, Gemini and even Grok look sluggish. Groq claims to provide “the world’s fastest large language models,” and third-party tests are saying that claim might hold up.

In a split second, Groq produces hundreds of words in a factual answer, citing sources along the way, according to a demo posted on X. In another demo, founder and CEO Jonathan Ross let a CNN host have a real-time, verbal conversation with an AI chatbot halfway across the world on live television. ChatGPT, Gemini, and other chatbots are already impressive, but Groq could make them lightning fast: fast enough to have practical use cases in the real world.

Groq’s AI Chip Breaks Speed Records

Groq creates AI chips called Language Processing Units (LPUs), which it claims are faster than Nvidia’s Graphics Processing Units (GPUs). Nvidia’s GPUs are generally seen as the industry standard for running AI models, but early results suggest that LPUs might blow them out of the water.

Groq is an “inference engine,” not a chatbot like ChatGPT, Gemini, or Grok. It helps these chatbots run incredibly fast but does not replace them altogether. On Groq’s website, you can test out different chatbots and see how fast they run using Groq’s LPUs.
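To make the “inference engine” idea concrete, here is a minimal sketch of what timing an LPU-hosted model from code might look like. It assumes an OpenAI-compatible chat endpoint; the base URL, model name, and environment variable below are illustrative assumptions rather than details confirmed in this article.

```python
# Minimal sketch: timing a chat completion served from an OpenAI-compatible endpoint.
# The base URL, model name, and environment variable are assumptions for illustration only.
import os
import time

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["GROQ_API_KEY"],         # hypothetical environment variable
)

start = time.perf_counter()
response = client.chat.completions.create(
    model="mixtral-8x7b-32768",  # hypothetical LPU-hosted model name
    messages=[{"role": "user", "content": "What does an inference engine do?"}],
)
elapsed = time.perf_counter() - start

tokens = response.usage.completion_tokens
print(f"{tokens} tokens in {elapsed:.2f}s, about {tokens / elapsed:.0f} tokens/sec")
```

The point is that Groq sits underneath the chatbot: the prompt and the model stay the same, and only the hardware serving the response changes.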

Groq produces 247 tokens per second compared to Microsoft’s 18 tokens per second, according to a third-party test from Artificial Analysis published last week. That means ChatGPT could run more than 13 times as fast if it were running on Groq’s chips.
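The speedup is simple arithmetic on the benchmark numbers quoted above; the 500-token answer length in the sketch below is just an illustrative assumption.

```python
# Back-of-envelope math using the throughput figures quoted above.
groq_tps = 247   # tokens/second measured for Groq (Artificial Analysis)
azure_tps = 18   # tokens/second measured for Microsoft's hosted version

print(f"Speedup: {groq_tps / azure_tps:.1f}x")  # ~13.7x, i.e. "more than 13x"

# How long a 500-token answer would take (response length is an assumption).
response_tokens = 500
print(f"At 247 tokens/sec: {response_tokens / groq_tps:.1f} s")   # ~2.0 s
print(f"At 18 tokens/sec:  {response_tokens / azure_tps:.1f} s")  # ~27.8 s
```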

AI chatbots like ChatGPT, Gemini, and Grok could be significantly more useful if they were faster. One current limitation is that these models can’t keep up with real-time human speech; the delays make conversations feel robotic. Google recently faked its Gemini demo to make it look like Gemini could have a real-time, multi-modal conversation, even though it can’t. But with Groq’s increased speeds, that video could become a reality.

Before Groq, Ross co-founded Google’s AI chip division, which produced cutting-edge chips to train AI models. With LPUs, Ross says Groq sidesteps two bottlenecks that slow large language models down on GPUs and CPUs: compute density and memory bandwidth.
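To see why memory bandwidth is the kind of bottleneck Ross is pointing at, a rough rule of thumb is that generating one token for a single user means streaming essentially every model weight out of memory once, so throughput is capped at bandwidth divided by model size. The hardware and model numbers in the sketch below are illustrative assumptions, not Groq or Nvidia specifications.

```python
# Rough upper bound on single-stream (batch size 1) decoding speed:
# each new token requires reading roughly all model weights from memory,
# so tokens/sec <= memory_bandwidth / model_size_in_bytes.
# All figures below are illustrative assumptions.

def max_tokens_per_second(bandwidth_gb_s: float, params_billions: float,
                          bytes_per_param: float = 2.0) -> float:
    model_bytes = params_billions * 1e9 * bytes_per_param  # e.g. fp16 weights
    return bandwidth_gb_s * 1e9 / model_bytes

# A 70-billion-parameter model in fp16 on a hypothetical 2 TB/s accelerator:
print(f"{max_tokens_per_second(2000, 70):.0f} tokens/sec upper bound")   # ~14

# The same model on hardware with 10x the effective memory bandwidth:
print(f"{max_tokens_per_second(20000, 70):.0f} tokens/sec upper bound")  # ~143
```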

The name Grok comes from Stranger in a Strange Land, a 1961 science fiction book by Robert Heinlein. The word means “to understand profoundly and intuitively.” That’s the reason so many AI companies are using it to describe their AI products.

Not only is there Ross’s Groq and Elon Musk’s Grok, but there’s also an AI-enabled IT company named Grok. Grimes also has an AI-powered toy called Grok, supposedly named after the way she and Musk’s children say “Grocket.” However, Ross claims his Groq came first, in 2016.

“Welcome to Groq’s Galaxy, Elon,” said a November blog post from Ross, published three days after Elon Musk released xAI’s version of Grok. “You see, I am the founder and CEO of the company called Groq,” Ross wrote, making sure to note that Groq is a trademarked name.

While Groq is receiving a lot of buzz, it remains to be seen whether its AI chips can scale the way Nvidia’s GPUs or Google’s TPUs have. AI chips are a major focus these days for OpenAI CEO Sam Altman, who is even considering building his own. Groq’s increased chip speeds could jumpstart the AI world, creating new possibilities for real-time communication with AI chatbots.

