• loathsome dongeater@lemmygrad.mlOP
      16 hours ago

      Additionally, Google’s generative AI stuff is exceptionally half-baked, to an extent that seems impossible for a megacorporation of Google’s calibre. There has already been a ton of coverage on it, like the LLM summary on Google search suggesting you put inedible things on a pizza, and their image generator producing multiracial Nazis.

    • loathsome dongeater@lemmygrad.mlOP
      19 hours ago

      It’s not uncommon for these things to glitch out. There are many reasons, but I know of one. An LLM’s raw output is by nature deterministic: given the same prompt, it produces the same scores over possible next tokens. To make the output more interesting, their designers inject some randomness when picking the next token, controlled by a parameter they call temperature. Sometimes this randomness can send it off the rails.
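
      A minimal sketch of what that looks like (illustrative Python, not Google’s actual sampler): the model scores every candidate next token, and temperature controls how much randomness goes into choosing one.

      ```python
      import numpy as np

      def sample_next_token(logits, temperature=1.0, rng=None):
          """Pick a next-token id from raw model scores (logits).

          temperature <= 0 means greedy decoding (always the top token);
          higher temperatures flatten the distribution and add randomness.
          """
          rng = rng or np.random.default_rng()
          logits = np.asarray(logits, dtype=np.float64)
          if temperature <= 0:
              return int(np.argmax(logits))          # deterministic choice
          scaled = logits / temperature
          probs = np.exp(scaled - scaled.max())      # numerically stable softmax
          probs /= probs.sum()
          return int(rng.choice(len(probs), p=probs))

      # Three candidate tokens with one clear favourite:
      logits = [2.0, 1.0, 0.1]
      print(sample_next_token(logits, temperature=0.0))  # always token 0
      print(sample_next_token(logits, temperature=1.5))  # occasionally token 1 or 2
      ```

      At temperature 0 the same prompt always yields the same continuation; crank it up and the model will sometimes pick a low-probability token, which is one way it wanders into nonsense.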