• Breezy@lemmy.world · 5 days ago

      Well, they'd be able to say how to make a bomb, or how to kill yourself effectively. AI CEOs don't even care what their systems can do. If some customers die, that's okay to them; it shows how intelligent their AI is. And that's a statement from one of the big AI CEOs.

      • porkloin@lemmy.world · 5 days ago

        I don’t think those are the categories where most people are finding LLMs frustrating. We keep being told human white-collar work is on the precipice of being replaced, but LLMs continue to be really inconsistent. Failing to parrot easily retrievable info, like how to build a legally restricted thing or off yourself, isn’t what people find lacking; it’s that half the time it does something sorta correctly, and the other half of the time it lies, fucks up, or fucks up and then lies about it.