• Kecessa@sh.itjust.works · 18 days ago

    Pigeon = edible bird

    Cleaning a bird = preparing a bird after killing it (a hunting term)

    AI figured the “rescued” part was either a mistake or that the person wanted to eat a bird they rescued

    If you search for “how to clean a dirty bird” instead, you give it better context

          • Rivalarrival@lemmy.today · 17 days ago

            My point wasn’t that LLMs are capable of reasoning. My point was that the human capacity for reasoning is grossly overrated.

            The core of human reasoning is simple pattern matching: regurgitating what we have previously observed. That’s what LLMs do well.

            LLMs are basically at the toddler stage of development, but with an extraordinary vocabulary.

        • HighlyRegardedArtist@lemmy.world · 18 days ago

          I have to disagree with that. To quote the comment I replied to:

          AI figured the “rescued” part was either a mistake or that the person wanted to eat a bird they rescued

          Where’s the “turn of phrase” in this, lol? It could hardly read any more clearly that they assume this “AI” can “figure” stuff out, which is simply false for LLMs. I’m not trying to attack anyone here, but spreading misinformation is not ok.

          • Kecessa@sh.itjust.works · 18 days ago

            I’ll be the first one to explain to people that AI as we know it is just pattern recognition, so yeah, it was a turn of phrase, thanks for your concern.

            • HighlyRegardedArtist@lemmy.world · 18 days ago

              Ok, great to know. Nuance doesn’t cross the internet well, so your intention wasn’t clear, given all the uninformed hype & grifters around AI. Being somewhat blunt helps get the intended point across better. ;)

    • FlorianSimon@sh.itjust.works · 18 days ago

      I like how you’re making excuses for something that is very clear in context. I thought AI was great at picking up context?

      • iAmTheTot@sh.itjust.works · 18 days ago

        I don’t think they are really “making excuses”, just explaining how the search came up with those steps, which is what the OP is so confused about.

      • lunarul@lemmy.world · 18 days ago

        I thought AI was great at picking up context?

        I don’t know why you thought that. LLMs split your question into separate words and assign scores to those words, then look up answers relevant to those words. They have no idea how those words relate to each other. That’s why LLMs couldn’t answer how many "r"s are in “strawberry”: they assigned the word “strawberry” a lower relevancy score in that question. The word “rescue” is probably treated the same way here.
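
        As a rough illustration of the “strawberry” point: LLMs operate on subword tokens rather than individual letters, so the model never “sees” the three separate “r” characters. A minimal sketch, assuming the open-source tiktoken tokenizer is installed (this is not part of the original search result, just an example):

        ```python
        # Minimal sketch: show the subword pieces a model receives instead of letters.
        # Assumes `pip install tiktoken`; cl100k_base is one publicly documented encoding.
        import tiktoken

        enc = tiktoken.get_encoding("cl100k_base")
        tokens = enc.encode("strawberry")

        # Decode each token id back to its text piece.
        print([enc.decode([t]) for t in tokens])
        # Typically prints something like ['str', 'aw', 'berry'] -- the exact split
        # depends on the tokenizer, but there are no standalone letters to count.
        ```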

    • DannyBoy@sh.itjust.works · 18 days ago

      The context is clear to a human. If an LLM is giving advice to everybody who asks a question in Google, it needs to do a much better job at giving responses.

      • bluewing@lemm.ee · 18 days ago

        Pigeons bought in a grocery store - see squab - are usually clean and prepped for cooking. So while the de-boning instructions were not good, the AI wasn’t technically wrong.

        But while a human can make the same mistake - many here just assume the question was about how to wash a rescued pigeon, and maybe that’s not the original intent - what a human can do that AI cannot is ask for clarification about the original question and its intent. We do this kind of thing every day.

        At the very best, AI can only supply multiple different answers when a question is poorly worded or it misunderstands something in the original question (and it seems to be very bad at even that, or simply can’t do it at all). And we would need to be able to choose the correct answer from the several provided.

  • EtherWhack@lemmy.world · 18 days ago

    I mean, if they were actually “clean” and had a healthy diet compared to what they eat in urban areas, they could make an awesome protein source for the budget-minded.

    • Kaboom@reddthat.com · 18 days ago

      You theoretically could, but small birds like that have very little meat on their bones. Most people hunt duck or turkey for a reason: the bigger birds have more meat.

      • EtherWhack@lemmy.world · 18 days ago

        Though, you wouldn’t want to eat one you “recovered” from an urban area, where it’s had an unknown diet, due to all the toxins it may have accumulated in its body.

  • RagnarokOnline@programming.dev · 18 days ago

    lol, definitely missed some important context.

    I guess it thought OOP meant “clean” as in how to dress the bird before you cook it. (As in: “clean a fish” means to filet the fish and prep it for cooking.)

    • Empricorn@feddit.nl · 17 days ago

      It literally doesn’t matter. When the most-used search engine on the planet automatically suggests these specific actions without you even clicking on a specific site? We’re fucked. We had the chance to break up monopolies like Google, Microsoft and Facebook. We didn’t take it…

    • jaybone@lemmy.world · 18 days ago

      But first it said they are usually clean, so that can’t be the context. If there even was a context. But there is no context, because AI is fucking stupid, and all these C-suite assholes pushing it like their last bowel movement will be eating crow off of their golden parakeets about two years from now, when all this nonsense finally goes away and the next shiny thing is flashing around.

      • jj4211@lemmy.world · 18 days ago

        There are signs of three distinct interpretations in the result:

        • On topic: cleaning a wild bird you are trying to save
        • Preparing a store-bought turkey (removing a label)
        • Preparing a wild bird that was caught while hunting

        It’s actually a pretty good illustration of how AI assembles “information-shaped text”, and of how smooth it can look and yet how dumb it can be about it. Unfortunately, advocates will just say “I can’t reproduce this specific mistake with it or another LLM, so there’s no problem”, even as it gets other stuff wrong. It’s weird: you’d better be able to second-guess the result, meaning you can never be confident in an answer you didn’t already know, and when that’s the case, it’s not that great for factual stuff.

        For “doesn’t matter” content, it may do fine (generated alternatives to stock photography, silly meme pictures, random prattle from background NPCs in a game), but for “stuff that matters”, Generative AI is frequently more of a headache than a help.

    • Lucidlethargy@sh.itjust.works · 17 days ago

      What the fuck are you talking about? Stop apologizing for AI, you clown.

      Do you remove the “label” in step one of cleaning a fish? Please, tell us all where that is.

  • WrenFeathers@lemmy.world · 18 days ago

    It’s all fun and games until someone gets hurt. They need to pull the plug on this shit and stop beta testing misinformation.

  • FelixCress@lemmy.world · 18 days ago

    They taste a lot like beef.

    You take the breasts and fry them in olive oil with a little bit of garlic and soy sauce. Delicious.

  • Lost_My_Mind@lemmy.world · 18 days ago

    I’m going to build a bunch of cyborgs, who follow orders exclusively via googleAI.

    I figure once I release about 4 billion of them into the world, either google stops doing evil shit, or they do REALLY evil shit. We shall see what happens…

    • mindaika@lemmy.dbzer0.com · 18 days ago

      It is kind of an interesting idea: what does “statistical average morality” look like when it’s got a 3m tall power frame and a handheld howitzer?

    • Mesophar@lemm.ee · 18 days ago

      Eventually, it will. Because even with janky responses like that one, corporations will try to cut costs everywhere they can. Is AI at the point where it will happen this year? Hell no! But don’t think it isn’t the direction they are trying to take it.

    • Thorry84@feddit.nl · 18 days ago

      AI will take jobs when the shareholders think it will make them more money. This has very little, if anything, to do with how good said AI is at the job.

      • MonkderVierte@lemmy.ml · 18 days ago

        Good. Hopefully the stock market will die before humanity does. A system that serves only money is not sustainable.

  • theneverfox@pawb.social · 16 days ago

    I find it hilarious that my personal AI, which can run on even a budget gaming PC, is far more reliable than most of these corporate ones 100x its size.

    • jj4211@lemmy.world · 18 days ago

      I mean, not one a human would ever make.

      First off, the word “rescued” would have immediately made the context of “protect the pigeon” clear.

      Second, a “rescued pigeon” wouldn’t have a label on it, so it’s clearly mixing in something, likely from a store-bought turkey, but then the other steps don’t make sense either, since they don’t apply here.

      A traditional search approach would not have made this mistake either: it would either have failed to find anything or found actual on-topic results. The LLM is “clever” enough to genericize “pigeon” to “birds” and hit upon text about birds from a grocery store and birds that you hunted, then mix it all together in coherent language but with content that is nonsense.
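
      As a rough sketch of how that genericizing can happen (purely an illustration using the open-source sentence-transformers library and the public all-MiniLM-L6-v2 model, not Google’s actual pipeline), embedding-based retrieval can score rescue, grocery, and hunting text as similarly relevant to the same query:

      ```python
      # Illustrative only: shows how semantic similarity can blur "rescued pigeon"
      # into generic bird-preparation content. Assumes `pip install sentence-transformers`.
      from sentence_transformers import SentenceTransformer, util

      model = SentenceTransformer("all-MiniLM-L6-v2")

      query = "how to clean a rescued pigeon"
      candidates = [
          "how to wash and care for an injured wild bird",
          "how to clean and prepare a store-bought turkey for cooking",
          "how to dress a game bird after hunting",
      ]

      query_emb = model.encode(query, convert_to_tensor=True)
      cand_embs = model.encode(candidates, convert_to_tensor=True)

      # Cosine similarities tend to land in the same ballpark for all three,
      # so a purely similarity-driven step can happily mix them together.
      for text, score in zip(candidates, util.cos_sim(query_emb, cand_embs)[0]):
          print(f"{score.item():.2f}  {text}")
      ```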

      In this case it’s hilarious; in other day-to-day situations it’s maddening, as some professional colleague gets the same sort of nonsense but lacks the knowledge to correct it and relays it as fact. Then, when called out because the data was in fact so bad it wasted time, they just say ‘oh, lol, AI’ (they wanted to take credit for it if it worked, but can hide behind the AI when it doesn’t).