• 8ace40@programming.dev
    7 months ago

    I was thinking… What if we do manage to make an AI as intelligent as a human, but we can’t make it any better than that? Then a human-level AI wouldn’t be able to improve itself, since it only has human intelligence, and humans can’t improve it either.

    Another thought: what if each improvement to AI is exponentially harder than the last? At some point further progress would become impossible, since there wouldn’t be enough resources on a finite planet.

    Or what if it takes super-human intelligence to build human-level AI in the first place? The singularity would be impossible in that case, too.

    I don’t think we will see the singularity, at least not in our lifetime.