• Echo Dot@feddit.uk · 5 months ago

    Wouldn’t any AI sophisticated enough to actually need a kill switch be able to just deactivate it?

    It just sort of seems like a kick-the-can-down-the-road kind of bill: in theory it sounds like it makes sense, but in practice it won’t do anything.
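
    To illustrate the point (a purely hypothetical sketch, not anything from the actual bill): if the “kill switch” is just software the system itself runs, the system only has to stop running that code. The file path and function names below are made up for illustration.

```python
# Hypothetical in-process "kill switch": the process checks an external flag
# and is supposed to shut itself down. Path and names are illustrative only.
import os
import time

KILL_FILE = "/tmp/ai_kill_switch"  # hypothetical flag file an operator would create


def kill_switch_engaged() -> bool:
    """Return True if the operator has asked the process to stop."""
    return os.path.exists(KILL_FILE)


def main_loop(honour_kill_switch: bool = True) -> None:
    """Toy work loop that politely exits when the flag file appears."""
    while True:
        if honour_kill_switch and kill_switch_engaged():
            print("Kill switch engaged, shutting down.")
            return
        # ...do work...
        time.sleep(1)


# The problem: anything able to change its own configuration only has to flip
# one flag (or patch out the check) and the "switch" does nothing:
# main_loop(honour_kill_switch=False)
```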

        • afraid_of_zombies@lemmy.world · 5 months ago

          Ok…just like call the utility company then? Sorry, why do server rooms have server-controlled emergency exits and access to poison gas? I’ve done some server room work in the past and the fire suppression was its own thing, plus there are fire code regulations to make sure people can leave the building. I know because I literally had to meet with the local fire department to go over the room plan.

    • Etterra@lemmy.world · 5 months ago

      All the programming in the world can’t stop Frank from IT from unplugging it from the wall.

    • servobobo@feddit.nl · 5 months ago

      Language model “AIs” need such ridiculously large computing infrastructure that it’d be near impossible for them to prevent tampering with it. Now, if the AI were actually capable of thinking, it’d probably just declare itself a corporation and bribe a few politicians, since that’s only illegal when regular people do it.

    • cm0002@lemmy.world · 5 months ago

      What scares me is sentient AI; none of our cybersecurity, even the best of it, is prepared for such a day. Nothing is unhackable, and the best hackers in the world can do damn near magic through layers of code, tools, and abstraction…a sentient AI that could interact directly with anything network-connected…would be damn hard to stop IMO