• snooggums@lemmy.world
    51 points · 2 months ago

    Or, if you are set on using AI Overviews to research products, be intentional about asking for negatives, and always fact-check the output, as it is likely to include hallucinations.

    If it is necessary to fact-check something every single time you use it, what benefit does it actually provide?

    • Feyd@programming.dev
      12 points · 2 months ago

      That is my entire problem with LLMs and LLM-based tools. I get especially salty when someone sends me output from one and I can confirm it’s lying within two minutes.