• werefreeatlast@lemmy.world · 6 months ago

    Okay Google… I’m about to go to sleep but I must know something before I go… If I could get the perfect penis to attract my perfect female counterpart, describe my penis, where my wife put it, and how many pieces she cut it into. Most importantly, will the scars make it ribbed for her pleasure?

  • Nobody@lemmy.world · 6 months ago

    Tech company creates best search engine → achieves world domination → becomes VC company in a tech trench coat → destroys search engine to prop up bad investments in advanced artificial-intelligence chatbots.

    • stellargmite@lemmy.world · 6 months ago

      Then hire cheap human intelligence to correct the AI’s hallucinatory trash, which was trained on actual human-generated content whose original intended audience understood its nuanced context and meaning in the first place. Wow, it’s more like they’ve shovelled a bucket of horse manure onto the pizza as well as the glue. Added value for the advertisers. AI my arse. I think calling these things language models is being generous. More like energy- and data-hungry vomitrons.

      • WhatAmLemmy@lemmy.world · 6 months ago

        Calling these things Artificial Intelligence should be a crime. It’s false advertising! Intelligence requires critical thought. They possess zero critical thought. They’re stochastic parrots, whose only skill is mimicking human language, and they can only mimic convincingly when fed billions of examples.
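
        To make the “stochastic parrot” point concrete, here’s a toy sketch (nothing like the architecture or scale of a real LLM): a bigram model that generates text purely by replaying word-to-word frequencies from its training text. It produces fluent-looking strings while having no representation of truth or meaning at all.

        ```python
        import random
        from collections import defaultdict

        # Toy training text; a real model is fed billions of examples.
        training_text = (
            "the cat sat on the mat . the dog sat on the rug . "
            "the cat chased the dog ."
        )

        # Record which words have been observed to follow each word.
        follows = defaultdict(list)
        words = training_text.split()
        for current, nxt in zip(words, words[1:]):
            follows[current].append(nxt)

        def parrot(start: str, length: int = 8) -> str:
            """Generate text by sampling each next word from observed continuations."""
            out = [start]
            for _ in range(length):
                options = follows.get(out[-1])
                if not options:
                    break
                out.append(random.choice(options))
            return " ".join(out)

        print(parrot("the"))  # fluent-looking mimicry, zero understanding of cats or mats
        ```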

  • FlihpFlorp@lemm.ee · 6 months ago

    I remember seeing a comment on here that said something along the lines of “for every dangerous or wrong response that goes public there’s probably 5, 10 or even 100 of those responses that only one person saw and may have treated as fact”

  • Lvxferre@mander.xyz · 6 months ago

    The reason why Google is doing this is simply PR. It is not to improve its service.

    The underlying tech is likely Gemini, a large language model (LLM). LLMs handle chunks of words (tokens), not what those words convey, so they have no way to tell accurate info apart from inaccurate info, jokes, “technical truths”, etc. As a result their output is often garbage.
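
    To illustrate, here’s a toy sketch (a whitespace “tokenizer” standing in for the subword tokenizers real LLMs use): the model only ever receives sequences of token IDs, and nothing in that representation marks a statement as accurate, a joke, or sarcasm.

    ```python
    # Toy tokenizer: map each word to an integer ID (a stand-in for real subword tokenizers).
    vocab: dict[str, int] = {}

    def encode(text: str) -> list[int]:
        """Turn text into the only thing the model ever sees: a list of token IDs."""
        ids = []
        for word in text.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab)
            ids.append(vocab[word])
        return ids

    print(encode("the moon orbits the earth"))    # [0, 1, 2, 0, 3]
    print(encode("the moon is made of cheese"))   # [0, 1, 4, 5, 6, 7]
    # Both are just ID sequences; nothing distinguishes the accurate one from the joke.
    ```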

    You might manually prevent the LLM from outputting a certain piece of garbage, perhaps even a thousand pieces. But in the big picture it won’t matter, because it’s outputting a million different pieces of garbage; it’s like trying to empty the ocean with a small bucket.
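
    As a sketch of that bucket (a hypothetical illustration of manual suppression, not Google’s actual mechanism): every bad answer has to be discovered and added to a list by hand, while anything not yet on the list sails straight through.

    ```python
    # Hand-maintained blocklist of answers someone already caught and reported.
    # The entries below are hypothetical examples.
    KNOWN_BAD = {
        "add glue to your pizza sauce",
        "eat one small rock per day",
    }

    def filter_answer(answer: str) -> str:
        """Suppress only the garbage already seen; pass everything else through."""
        if answer.lower().strip() in KNOWN_BAD:
            return "[answer withheld]"
        return answer

    print(filter_answer("Add glue to your pizza sauce"))           # caught
    print(filter_answer("Use gasoline to cook spaghetti faster"))  # not caught
    ```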

    I’m not making the above up; look at the article - it’s basically what Gary Marcus is saying, in different words.

    And I’m almost certain that the decision makers at Google know this. However, they want to compete with other tendrils of the GAFAM cancer for a turf called “generative models” (which includes tech like LLMs). And if their search gets wrecked in the process, who cares? That turf is safe anyway, as long as you can prop it up with enough PR.

    Google continues to say that its AI Overview product largely outputs “high quality information” to users.

    There’s a three-letter word that accurately describes what Google said here: lie.

    • lemmyvore@feddit.nl · 6 months ago

      At some point no amount of PR will hide the fact that search has become useless. They know this, but they’re getting desperate and will try anything.

      I’m waiting for Yahoo to revive their link directory or for Mozilla to revive DMOZ. That will be the sign that the shit level is officially chin-height.

  • Deebster@programming.dev · 6 months ago

    […] a lot of AI companies are “selling dreams” that this tech will go from 80 percent correct to 100 percent.

    In fact, Marcus thinks that last 20 percent might be the hardest thing of all.

    Yeah, it’s well known; people say, for example, “the last 20% takes 80% of the effort”. All the most tedious and difficult stuff gets postponed to the end, which is why so many side projects never get completed.

    • scrion@lemmy.world · 6 months ago

      It’s not just the difficult stuff, but often the mundane (e.g. stability, user friendliness, polish, scalability) that takes something from working in a constrained environment to an actual product. It’s a chore to work on and a lot less “sexy”, with never enough resources allocated to it: we’ve already done all the difficult stuff, how much more work can this be?

      Turns out, a fucking lot.

      • Deebster@programming.dev · 6 months ago

        Absolutely, that’s what I was thinking of when I wrote “tedious”; all the stuff you mentioned matters a lot to the user (or product owner) but isn’t the interesting stuff for a programmer.

    • technocrit@lemmy.dbzer0.com · 6 months ago

      While I agree with the underlying point, the “Pareto Principle” is “well known” the way “a stitch in time saves nine” is well known. I wish this adage would disappear from scientific circles; it instantly decreases credibility. It’s a pet peeve, but here’s a great example of why: pseudo-scientific grifters.

  • Juice@midwest.social · 6 months ago

    Good, remove all the weird reddit answers, leaving only the “14-year-old neo-nazi” reddit answers, the “cop pretending to be a leftist” reddit answers, and the “39-year-old pedophile” reddit answers. This should fix the problem and restore Google back to its defaults.