• logicbomb@lemmy.world · 11 months ago

    People are able to explain themselves, and some AI can as well, with similarly poor results.

    I’m reminded of one of Asimov’s stories about a robot whose job was to aim an energy beam at a collector on Earth.

    Upon talking to the robot, they realized that, to it, the work was less a job and more a religion.

    The inspector freaked out because this meant that the robot wasn’t performing to specs.

    Spoilers: Eventually they realized that the robot was doing the job correctly either way, so they just let it carry on, whatever its reasons.