• snooggums@lemmy.world · 18 hours ago

    Or if you are set on using AI Overviews to research products, then be intentional about asking for negatives and always fact-check the output as it is likely to include hallucinations.

    If it is necessary to fact-check something every single time you use it, what benefit does it give?

    • Feyd@programming.dev · 8 hours ago

      That is my entire problem with LLMs and LLM-based tools. I get especially salty when someone sends me output from one and I confirm it’s lying within two minutes.

    • brsrklf@jlai.lu · edited 17 hours ago

      None. It’s made with the clear intention of substituting itself for actual search results.

      If you don’t fact-check it, it’s dangerous and/or a thinly disguised ad. If you do fact-check it, it brings absolutely nothing that you couldn’t find on your own.

      Well, except hallucinations, of course.

    • artyom@piefed.social · 13 hours ago

      It hasn’t stopped anyone from using ChatGPT, which has become Google’s biggest competitor since the inception of web search.

      So yes, it’s dumb, but they kind of have to do it at this point. And they need everyone to know it’s available from the site they’re already using, so they push it on everyone.

    • TXL@sopuli.xyz · 14 hours ago

      It might be able to give you tables or otherwise collated sets of information about multiple products etc.

      I don’t know if Google’s does, but LLMs can. They can also do unit conversions, though you probably still want to check the critical ones. It’s a bit like using an encyclopedia or a catalog, except more convenient and even less reliable.

      • Feyd@programming.dev · edited 8 hours ago

        You can do unit conversions with PowerToys on Windows, Spotlight on macOS, and whatever they call the nifty search bar on various Linux desktop environments, without even hitting the internet, with exactly the same convenience as an LLM. Doing discrete things like that with LLM inference is the most inefficient and stupid way to do them.
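        The deterministic tools described above boil down to a factor table and arithmetic, with no inference anywhere. A minimal sketch of that idea (the unit names and factor table here are my own illustration, not any particular tool’s implementation):

        ```python
        # Deterministic length conversion: look up exact factors, multiply, divide.
        # No model, no approximation -- the answer is the same every time.

        FACTORS_TO_METERS = {
            "m": 1.0,
            "km": 1000.0,
            "mi": 1609.344,   # international mile, exact by definition
            "ft": 0.3048,     # international foot, exact by definition
            "in": 0.0254,
        }

        def convert_length(value: float, src: str, dst: str) -> float:
            """Convert between units via a common base unit (meters)."""
            return value * FACTORS_TO_METERS[src] / FACTORS_TO_METERS[dst]

        print(convert_length(10, "km", "mi"))  # ~6.2137
        ```

        This is roughly what a launcher plugin or conversion website does under the hood, which is why routing such a lookup through an LLM adds cost without adding correctness.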

      • snooggums@lemmy.world · edited 14 hours ago

        Google had a feature for converting units way before the AI boom, and there are multiple websites that do conversions and calculations with real logic instead of LLM approximation.

        It is more like asking a random person who will answer whether they know the right answer or not. An encyclopedia or a catalog at least has some kind of time-frame context for when it was published.

        Putting the data into tables and other formats isn’t helpful if the data is wrong!