Mint Primer | DeepSeek: A Chinese marvel or OpenAI copy?

DeepSeek has challenged big tech by showing that capable AI can be built without the costliest graphics processing units or massive data centres. The AI lab shocked the industry by training its open-source R1 model on Nvidia's lower-capability H800 chips for under $6 million—a fraction of the billions spent on OpenAI's ChatGPT or Google's Gemini.

DeepSeek's low-cost, energy-efficient, open-source approach could democratize access to advanced AI, challenging Microsoft, Google, Meta and Nvidia, while proving that frontier-class models can be built without a huge outgo.

There is a catch, however. Distillation—the suspicion that DeepSeek may have used OpenAI's outputs as "teacher" data to train its own model—would cut costs and development time, but it would also breach OpenAI's licence terms and raises concerns about originality, ethics and intellectual property rights.

Either way, R1 proves AI can be built affordably, much as Altman's remarks spurred Indian startups to develop small language models for under $10 million.
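The distillation idea mentioned above can be illustrated with a minimal sketch. In knowledge distillation, a smaller "student" model is trained to match the full probability distribution a larger "teacher" model assigns to possible outputs, not just the teacher's single best answer. The loss commonly used is the KL divergence between the two softened distributions. All names and numbers below are hypothetical, for illustration only; this is not DeepSeek's or OpenAI's actual training code.

```python
import math

def softmax(logits, temperature=1.0):
    """Turn raw model scores (logits) into probabilities.

    A temperature > 1 'softens' the distribution, exposing more of
    the teacher's knowledge about near-miss answers.
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's soft targets to the student's.

    Training the student to minimise this pushes its output
    distribution toward the teacher's, which is far cheaper than
    training from scratch on raw data.
    """
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return sum(p * math.log(p / q) for p, q in zip(t, s) if p > 0)

# Hypothetical logits for one prediction over three candidate tokens.
teacher = [4.0, 1.0, 0.2]
student = [3.5, 1.2, 0.3]
loss = distillation_loss(teacher, student)  # small positive number:
# the student is close to, but not identical with, the teacher.
```

The key economic point is visible even in this toy: the student never sees the teacher's training data or weights, only its outputs, which is exactly why using another lab's model as the teacher is contentious.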

Live Mint
