alecco 24 minutes ago

> We believe they have access to around 50,000 Hopper GPUs, which is not the same as 50,000 H100, as some have claimed.

Source: trust me bro, an affected Silicon Valley AI CEO said so.

> As history shows, a small well-funded and focused startup can often push the boundaries of what’s possible. DeepSeek lacks the bureaucracy of places like Google, and since they are self funded can move quickly on ideas.

But Google's multiple AI labs invented most of these things, from the TPU (now Tensor Cores) to the Transformer. And DeepMind has the best model at the moment, Gemini 2.0 Flash. The difference is that they don't publish weights and became reclusive thanks to OpenAI's sociopathic behavior against Google.