So what did DeepSeek do that deep-pocketed OpenAI didn't?
By comparison, OpenAI CEO Sam Altman said that GPT-4 cost more than $100 million to train.
DeepSeek didn't invent most of the optimization techniques it used.
DeepSeek’s latest AI models outperform OpenAI’s on some advanced tasks and were much cheaper to build. © CFOTO/Getty Images
Some, like using data formats that use less memory, have been proposed by its bigger competitors.
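The article doesn't name the exact formats involved, but the idea behind lower-memory data formats is simple: storing each number with fewer bits cuts memory use roughly in proportion. A minimal sketch in Python (using NumPy, purely as an illustration, not DeepSeek's actual implementation):

```python
import numpy as np

# Hypothetical illustration: the same one million values stored at two precisions.
n = 1_000_000
full_precision = np.ones(n, dtype=np.float32)  # 4 bytes per value
half_precision = np.ones(n, dtype=np.float16)  # 2 bytes per value

print(full_precision.nbytes)  # 4000000 bytes
print(half_precision.nbytes)  # 2000000 bytes -- half the memory for the same count
```

The trade-off is precision: lower-bit formats represent numbers more coarsely, which is why using them for model training requires care.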
The company hasn't said how exactly it did that.
DeepSeek, on the other hand, laid out its process.
The results of the pure reinforcement-learning approach weren't perfect. The R1-Zero model's outputs were sometimes difficult to read and switched between languages.
Doing more with less underpins the approach taken at several Chinese state-funded labs.