• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: October 23rd, 2024





  • This is not credible. It reads like self-promoting, pump-and-dump PR. Vision AI models are smaller than text models: they need fast GPUs, but less memory. Narrow-purpose AI/neural network models need less memory because memory is mostly about storing facts rather than logic/reasoning capability. In benchmark score per GB, smaller LLMs are currently gaining faster than the largest frontier models. 32 GB is a reasonable ceiling for the memory requirement, and robots can swap in task-specific AI models as well.


  • A big, stupid magic trick: As these companies get desperate, expect someone (especially OpenAI) to try to show something “new and crazy” as a means of turning the narrative. When or if this happens, look very carefully at what they say about the product’s availability, what it can do, and who they show it to.

    Alternative scenario — Sora launches: If OpenAI gets desperate, it may move up the public launch of Sora, its generative video product. Doing so will only cause more problems: there isn’t a chance in hell that Sora is profitable, I’m fairly sure it’s even more expensive to run than ChatGPT, and I imagine its visual inconsistencies and hallucinations would make for some entertaining content for YouTubers and tech reporters.

    That prediction held up, just not the part where they cancel it a few months later.

    To be fair to OpenAI, they are now saying “just use ChatGPT for video too.” Good chance that Disney cancelled them before they cancelled Sora.