• brucethemoose@lemmy.world

    Oh and to answer this, specifically: Nvidia GPUs have been used in ML research forever. It goes back to at least 2008 and desktop cards like the GTX 280. Maybe earlier.

    Most “AI accelerators” are basically the same thing these days: overgrown desktop GPUs. They have pixel shaders, ROPs, video encoders and everything, with the one partial exception being the AMD MI300X and beyond (which are missing ROPs).

    CPUs were used, too. In fact, Intel made specific server SKUs for giant AI users like Facebook. See: https://www.servethehome.com/facebook-introduces-next-gen-cooper-lake-intel-xeon-platforms/