Nvidia rivals focus on building a different kind of chip to power AI products
Associated Press

SANTA CLARA, Calif. — Building the current crop of artificial intelligence chatbots has relied on specialized computer chips pioneered by Nvidia, which dominates the market and has made itself the poster child of the AI boom.

(Photo caption: Inside an AI inference chip lab.)

D-Matrix, which is launching its first product this week, was founded in 2019 — a bit late to the AI chip game, as CEO Sid Sheth explained during a recent interview at the company's headquarters in Santa Clara, California, the same Silicon Valley city that is also home to AMD, Intel and Nvidia.

"But I think there's going to be a learning curve in terms of integrating it," Feldgoise said. Unlike training-focused chips, he said, AI inference work prioritizes how fast a person gets a chatbot's response.

Sheth says the big concern right now is, "are we going to burn the planet down in our quest for what people call AGI — human-like intelligence?" It remains unclear when AI might reach the point of artificial general intelligence; predictions range from a few years to decades.

"They cannot be put on the same path." The other set of companies doesn't want to use very large AI models — they are too costly and use too much energy.