
  • 1 Post
  • 92 Comments
Joined 1 month ago
Cake day: September 14th, 2025

  • In fact, according to The Register, the GPU computing performance of the GB10 chip is roughly equivalent to an RTX 5070. However, the 5070 is limited to 12GB of video memory, which limits the size of AI models that can be run on such a system. With 128GB of unified memory, the DGX Spark can run far larger models, albeit at a slower speed than, say, an RTX 5090 (which typically ships with 24GB of RAM). For example, to run the larger 120-billion-parameter version of OpenAI’s recent gpt-oss language model, you’d need about 80GB of memory, which is far more than you can get in a consumer GPU.

    Or you could’ve just made GPUs, and then we’d all be gaming and calling each other shitheads in Valorant instead of - checks notes - literally stealing the water from poor communities.
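The memory math in the quoted excerpt can be sketched as a back-of-envelope calculation. The per-parameter byte counts and the 20% overhead factor below are illustrative assumptions, not figures from the article:

```python
def model_memory_gb(params_billions, bytes_per_param, overhead=1.2):
    """Rough GB needed to hold a model's weights in memory.

    overhead: assumed ~20% extra for activations/KV cache; the real
    figure varies with context length and runtime.
    """
    return params_billions * 1e9 * bytes_per_param * overhead / 1e9

# gpt-oss-120b at 4-bit quantization (0.5 bytes/param, assumed):
print(round(model_memory_gb(120, 0.5), 1))  # → 72.0, in the ballpark of the ~80GB cited

# Largest fp16 model (2 bytes/param) that fits in a 12GB RTX 5070:
# roughly 12 / (2 * 1.2) = 5 billion parameters.
print(round(model_memory_gb(5, 2), 1))  # → 12.0
```

This is why unified-memory machines like the DGX Spark can load models that no consumer card can, even if they run them more slowly.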

  • “It’s almost certain” that AI will reach that level eventually, one researcher told Nature.

    Semafor doing so much work trying to launder this into a story: “one scientist” in the original article becomes multiple scientists in their headline.

    This is the first of three waves of AI in science, says Sam Rodriques, chief executive of FutureHouse — a research lab in San Francisco, California, that debuted an LLM designed to do chemistry tasks earlier this year.

    And the one “scientist” seems to have switched tracks from doing actual research to doing capitalism.